id stringlengths 2 115 | lastModified stringlengths 24 24 | tags list | author stringlengths 2 42 ⌀ | description stringlengths 0 68.7k ⌀ | citation stringlengths 0 10.7k ⌀ | cardData null | likes int64 0 3.55k | downloads int64 0 10.1M | card stringlengths 0 1.01M |
|---|---|---|---|---|---|---|---|---|---|
open-llm-leaderboard/details_matsuo-lab__weblab-10b | 2023-09-12T11:52:03.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of matsuo-lab/weblab-10b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [matsuo-lab/weblab-10b](https://huggingface.co/matsuo-lab/weblab-10b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_matsuo-lab__weblab-10b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T11:50:50.938631](https://huggingface.co/datasets/open-llm-leaderboard/details_matsuo-lab__weblab-10b/blob/main/results_2023-09-12T11-50-50.938631.json) (note
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the \"results\" configuration and in\
\ the \"latest\" split of each eval):\n\n```python\n{\n    \"all\": {\n        \"acc\": 0.26828471244991603,\n\
\ \"acc_stderr\": 0.03199381514543671,\n \"acc_norm\": 0.27186026319371354,\n\
\ \"acc_norm_stderr\": 0.03199417611519423,\n \"mc1\": 0.2215422276621787,\n\
\ \"mc1_stderr\": 0.014537867601301139,\n \"mc2\": 0.3601821787254854,\n\
\ \"mc2_stderr\": 0.013633932896098346\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3583617747440273,\n \"acc_stderr\": 0.014012883334859859,\n\
\ \"acc_norm\": 0.39505119453924914,\n \"acc_norm_stderr\": 0.01428589829293817\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4833698466440948,\n\
\ \"acc_stderr\": 0.004987020679861267,\n \"acc_norm\": 0.6576379207329217,\n\
\ \"acc_norm_stderr\": 0.004735302937476539\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2,\n \
\ \"acc_stderr\": 0.03455473702325438,\n \"acc_norm\": 0.2,\n \"\
acc_norm_stderr\": 0.03455473702325438\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3881578947368421,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.3881578947368421,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.35,\n\
\ \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2490566037735849,\n \"acc_stderr\": 0.02661648298050171,\n\
\ \"acc_norm\": 0.2490566037735849,\n \"acc_norm_stderr\": 0.02661648298050171\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.03586879280080339,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.03586879280080339\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\
\ \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n\
\ \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.12745098039215685,\n \"acc_stderr\": 0.03318224921942077,\n\
\ \"acc_norm\": 0.12745098039215685,\n \"acc_norm_stderr\": 0.03318224921942077\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2978723404255319,\n \"acc_stderr\": 0.029896145682095462,\n\
\ \"acc_norm\": 0.2978723404255319,\n \"acc_norm_stderr\": 0.029896145682095462\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.1793103448275862,\n \"acc_stderr\": 0.03196766433373187,\n\
\ \"acc_norm\": 0.1793103448275862,\n \"acc_norm_stderr\": 0.03196766433373187\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.022789673145776564,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.022789673145776564\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1870967741935484,\n\
\ \"acc_stderr\": 0.02218571009225225,\n \"acc_norm\": 0.1870967741935484,\n\
\ \"acc_norm_stderr\": 0.02218571009225225\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.21182266009852216,\n \"acc_stderr\": 0.028748983689941075,\n\
\ \"acc_norm\": 0.21182266009852216,\n \"acc_norm_stderr\": 0.028748983689941075\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.03546563019624336,\n\
\ \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.03546563019624336\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2727272727272727,\n \"acc_stderr\": 0.03173071239071724,\n \"\
acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.03173071239071724\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.030276909945178267,\n\
\ \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.030276909945178267\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.23846153846153847,\n \"acc_stderr\": 0.021606294494647727,\n\
\ \"acc_norm\": 0.23846153846153847,\n \"acc_norm_stderr\": 0.021606294494647727\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275812,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275812\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.027025433498882378,\n\
\ \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.027025433498882378\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24770642201834864,\n \"acc_stderr\": 0.018508143602547808,\n \"\
acc_norm\": 0.24770642201834864,\n \"acc_norm_stderr\": 0.018508143602547808\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.18981481481481483,\n \"acc_stderr\": 0.026744714834691923,\n \"\
acc_norm\": 0.18981481481481483,\n \"acc_norm_stderr\": 0.026744714834691923\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.29957805907172996,\n \"acc_stderr\": 0.02981802474975309,\n \
\ \"acc_norm\": 0.29957805907172996,\n \"acc_norm_stderr\": 0.02981802474975309\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3632286995515695,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.3632286995515695,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.1984732824427481,\n \"acc_stderr\": 0.03498149385462472,\n\
\ \"acc_norm\": 0.1984732824427481,\n \"acc_norm_stderr\": 0.03498149385462472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3425925925925926,\n\
\ \"acc_stderr\": 0.04587904741301809,\n \"acc_norm\": 0.3425925925925926,\n\
\ \"acc_norm_stderr\": 0.04587904741301809\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467763,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467763\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n\
\ \"acc_stderr\": 0.028911208802749472,\n \"acc_norm\": 0.26495726495726496,\n\
\ \"acc_norm_stderr\": 0.028911208802749472\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2886334610472541,\n\
\ \"acc_stderr\": 0.016203792703197793,\n \"acc_norm\": 0.2886334610472541,\n\
\ \"acc_norm_stderr\": 0.016203792703197793\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2745664739884393,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.2745664739884393,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2829581993569132,\n\
\ \"acc_stderr\": 0.02558306248998482,\n \"acc_norm\": 0.2829581993569132,\n\
\ \"acc_norm_stderr\": 0.02558306248998482\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.025407197798890162,\n\
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.025407197798890162\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2607561929595828,\n\
\ \"acc_stderr\": 0.011213471559602332,\n \"acc_norm\": 0.2607561929595828,\n\
\ \"acc_norm_stderr\": 0.011213471559602332\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.026303648393696036,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.026303648393696036\n \
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\"\
: 0.2908496732026144,\n \"acc_stderr\": 0.018373116915903966,\n \"\
acc_norm\": 0.2908496732026144,\n \"acc_norm_stderr\": 0.018373116915903966\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.34545454545454546,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.34545454545454546,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.1673469387755102,\n \"acc_stderr\": 0.023897144768914524,\n\
\ \"acc_norm\": 0.1673469387755102,\n \"acc_norm_stderr\": 0.023897144768914524\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22388059701492538,\n\
\ \"acc_stderr\": 0.02947525023601718,\n \"acc_norm\": 0.22388059701492538,\n\
\ \"acc_norm_stderr\": 0.02947525023601718\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n\
\ \"acc_stderr\": 0.0357160923005348,\n \"acc_norm\": 0.30120481927710846,\n\
\ \"acc_norm_stderr\": 0.0357160923005348\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.34502923976608185,\n \"acc_stderr\": 0.03645981377388807,\n\
\ \"acc_norm\": 0.34502923976608185,\n \"acc_norm_stderr\": 0.03645981377388807\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2215422276621787,\n\
\ \"mc1_stderr\": 0.014537867601301139,\n \"mc2\": 0.3601821787254854,\n\
\ \"mc2_stderr\": 0.013633932896098346\n }\n}\n```"
repo_url: https://huggingface.co/matsuo-lab/weblab-10b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|arc:challenge|25_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hellaswag|10_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T11-50-50.938631.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T11-50-50.938631.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T11-50-50.938631.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T11-50-50.938631.parquet'
- config_name: results
data_files:
- split: 2023_09_12T11_50_50.938631
path:
- results_2023-09-12T11-50-50.938631.parquet
- split: latest
path:
- results_2023-09-12T11-50-50.938631.parquet
---
# Dataset Card for Evaluation run of matsuo-lab/weblab-10b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/matsuo-lab/weblab-10b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [matsuo-lab/weblab-10b](https://huggingface.co/matsuo-lab/weblab-10b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_matsuo-lab__weblab-10b",
"harness_truthfulqa_mc_0",
split="train")
```
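
Beyond the "train" split, each configuration also exposes a "latest" alias and one timestamped split per run (see the `configs` section of this card). As a minimal sketch, assuming only the config and split names listed in this card, any per-task configuration can be loaded the same way:
```python
from datasets import load_dataset

# Load the "latest" split of one of the per-task configurations;
# the config and split names come from this card's `configs` section.
data = load_dataset(
    "open-llm-leaderboard/details_matsuo-lab__weblab-10b",
    "harness_hellaswag_10",
    split="latest",
)
print(data)
```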
## Latest results
These are the [latest results from run 2023-09-12T11:50:50.938631](https://huggingface.co/datasets/open-llm-leaderboard/details_matsuo-lab__weblab-10b/blob/main/results_2023-09-12T11-50-50.938631.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.26828471244991603,
"acc_stderr": 0.03199381514543671,
"acc_norm": 0.27186026319371354,
"acc_norm_stderr": 0.03199417611519423,
"mc1": 0.2215422276621787,
"mc1_stderr": 0.014537867601301139,
"mc2": 0.3601821787254854,
"mc2_stderr": 0.013633932896098346
},
"harness|arc:challenge|25": {
"acc": 0.3583617747440273,
"acc_stderr": 0.014012883334859859,
"acc_norm": 0.39505119453924914,
"acc_norm_stderr": 0.01428589829293817
},
"harness|hellaswag|10": {
"acc": 0.4833698466440948,
"acc_stderr": 0.004987020679861267,
"acc_norm": 0.6576379207329217,
"acc_norm_stderr": 0.004735302937476539
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2,
"acc_stderr": 0.03455473702325438,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03455473702325438
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3881578947368421,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.3881578947368421,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2490566037735849,
"acc_stderr": 0.02661648298050171,
"acc_norm": 0.2490566037735849,
"acc_norm_stderr": 0.02661648298050171
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080339,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080339
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.12745098039215685,
"acc_stderr": 0.03318224921942077,
"acc_norm": 0.12745098039215685,
"acc_norm_stderr": 0.03318224921942077
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2978723404255319,
"acc_stderr": 0.029896145682095462,
"acc_norm": 0.2978723404255319,
"acc_norm_stderr": 0.029896145682095462
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.1793103448275862,
"acc_stderr": 0.03196766433373187,
"acc_norm": 0.1793103448275862,
"acc_norm_stderr": 0.03196766433373187
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.022789673145776564,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.022789673145776564
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1870967741935484,
"acc_stderr": 0.02218571009225225,
"acc_norm": 0.1870967741935484,
"acc_norm_stderr": 0.02218571009225225
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21182266009852216,
"acc_stderr": 0.028748983689941075,
"acc_norm": 0.21182266009852216,
"acc_norm_stderr": 0.028748983689941075
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.030276909945178267,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.030276909945178267
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23846153846153847,
"acc_stderr": 0.021606294494647727,
"acc_norm": 0.23846153846153847,
"acc_norm_stderr": 0.021606294494647727
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275812,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275812
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.027025433498882378,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.027025433498882378
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24770642201834864,
"acc_stderr": 0.018508143602547808,
"acc_norm": 0.24770642201834864,
"acc_norm_stderr": 0.018508143602547808
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18981481481481483,
"acc_stderr": 0.026744714834691923,
"acc_norm": 0.18981481481481483,
"acc_norm_stderr": 0.026744714834691923
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.29957805907172996,
"acc_stderr": 0.02981802474975309,
"acc_norm": 0.29957805907172996,
"acc_norm_stderr": 0.02981802474975309
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3632286995515695,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.3632286995515695,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.1984732824427481,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.1984732824427481,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3425925925925926,
"acc_stderr": 0.04587904741301809,
"acc_norm": 0.3425925925925926,
"acc_norm_stderr": 0.04587904741301809
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467763,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467763
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.26495726495726496,
"acc_stderr": 0.028911208802749472,
"acc_norm": 0.26495726495726496,
"acc_norm_stderr": 0.028911208802749472
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2886334610472541,
"acc_stderr": 0.016203792703197793,
"acc_norm": 0.2886334610472541,
"acc_norm_stderr": 0.016203792703197793
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2745664739884393,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.2745664739884393,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2829581993569132,
"acc_stderr": 0.02558306248998482,
"acc_norm": 0.2829581993569132,
"acc_norm_stderr": 0.02558306248998482
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.025407197798890162,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.025407197798890162
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930902,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930902
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2607561929595828,
"acc_stderr": 0.011213471559602332,
"acc_norm": 0.2607561929595828,
"acc_norm_stderr": 0.011213471559602332
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.25,
"acc_stderr": 0.026303648393696036,
"acc_norm": 0.25,
"acc_norm_stderr": 0.026303648393696036
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2908496732026144,
"acc_stderr": 0.018373116915903966,
"acc_norm": 0.2908496732026144,
"acc_norm_stderr": 0.018373116915903966
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.34545454545454546,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.34545454545454546,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.1673469387755102,
"acc_stderr": 0.023897144768914524,
"acc_norm": 0.1673469387755102,
"acc_norm_stderr": 0.023897144768914524
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22388059701492538,
"acc_stderr": 0.02947525023601718,
"acc_norm": 0.22388059701492538,
"acc_norm_stderr": 0.02947525023601718
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.30120481927710846,
"acc_stderr": 0.0357160923005348,
"acc_norm": 0.30120481927710846,
"acc_norm_stderr": 0.0357160923005348
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.34502923976608185,
"acc_stderr": 0.03645981377388807,
"acc_norm": 0.34502923976608185,
"acc_norm_stderr": 0.03645981377388807
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2215422276621787,
"mc1_stderr": 0.014537867601301139,
"mc2": 0.3601821787254854,
"mc2_stderr": 0.013633932896098346
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Undi95__CodeEngine | 2023-09-12T11:52:51.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Undi95/CodeEngine
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/CodeEngine](https://huggingface.co/Undi95/CodeEngine) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__CodeEngine\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T11:51:31.235775](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__CodeEngine/blob/main/results_2023-09-12T11-51-31.235775.json) (note
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5433186047027702,\n\
\ \"acc_stderr\": 0.03474642991909903,\n \"acc_norm\": 0.5472592242286033,\n\
\ \"acc_norm_stderr\": 0.03472683807272391,\n \"mc1\": 0.32313341493268055,\n\
\ \"mc1_stderr\": 0.016371836286454604,\n \"mc2\": 0.45182279949368886,\n\
\ \"mc2_stderr\": 0.01544933845997067\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5494880546075085,\n \"acc_stderr\": 0.014539646098471627,\n\
\ \"acc_norm\": 0.5836177474402731,\n \"acc_norm_stderr\": 0.014405618279436174\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6243776140211114,\n\
\ \"acc_stderr\": 0.004832934529120793,\n \"acc_norm\": 0.8227444732125074,\n\
\ \"acc_norm_stderr\": 0.0038110434120246575\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5131578947368421,\n \"acc_stderr\": 0.04067533136309174,\n\
\ \"acc_norm\": 0.5131578947368421,\n \"acc_norm_stderr\": 0.04067533136309174\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5849056603773585,\n \"acc_stderr\": 0.03032594578928611,\n\
\ \"acc_norm\": 0.5849056603773585,\n \"acc_norm_stderr\": 0.03032594578928611\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5028901734104047,\n\
\ \"acc_stderr\": 0.038124005659748335,\n \"acc_norm\": 0.5028901734104047,\n\
\ \"acc_norm_stderr\": 0.038124005659748335\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929777,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929777\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.031967586978353627,\n\
\ \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.031967586978353627\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30158730158730157,\n \"acc_stderr\": 0.023636975996101806,\n \"\
acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.023636975996101806\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.632258064516129,\n\
\ \"acc_stderr\": 0.02743086657997347,\n \"acc_norm\": 0.632258064516129,\n\
\ \"acc_norm_stderr\": 0.02743086657997347\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.03510766597959217,\n\
\ \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.03510766597959217\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.03663974994391244,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.03663974994391244\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6717171717171717,\n \"acc_stderr\": 0.03345678422756776,\n \"\
acc_norm\": 0.6717171717171717,\n \"acc_norm_stderr\": 0.03345678422756776\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7564766839378239,\n \"acc_stderr\": 0.030975436386845443,\n\
\ \"acc_norm\": 0.7564766839378239,\n \"acc_norm_stderr\": 0.030975436386845443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5,\n \"acc_stderr\": 0.02535100632816969,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.02535100632816969\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073838,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073838\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.48739495798319327,\n \"acc_stderr\": 0.03246816765752174,\n\
\ \"acc_norm\": 0.48739495798319327,\n \"acc_norm_stderr\": 0.03246816765752174\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7137614678899082,\n \"acc_stderr\": 0.019379436628919975,\n \"\
acc_norm\": 0.7137614678899082,\n \"acc_norm_stderr\": 0.019379436628919975\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896078,\n \"\
acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896078\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7130801687763713,\n \"acc_stderr\": 0.02944377302259469,\n\
\ \"acc_norm\": 0.7130801687763713,\n \"acc_norm_stderr\": 0.02944377302259469\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.03236198350928276,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.03236198350928276\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.04260735157644559,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.04260735157644559\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.045218299028335865,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.045218299028335865\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.04689765937278135,\n\
\ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.04689765937278135\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7393162393162394,\n\
\ \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.7393162393162394,\n\
\ \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7318007662835249,\n\
\ \"acc_stderr\": 0.015842430835269407,\n \"acc_norm\": 0.7318007662835249,\n\
\ \"acc_norm_stderr\": 0.015842430835269407\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5982658959537572,\n \"acc_stderr\": 0.026394104177643634,\n\
\ \"acc_norm\": 0.5982658959537572,\n \"acc_norm_stderr\": 0.026394104177643634\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35195530726256985,\n\
\ \"acc_stderr\": 0.01597266852368907,\n \"acc_norm\": 0.35195530726256985,\n\
\ \"acc_norm_stderr\": 0.01597266852368907\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6013071895424836,\n \"acc_stderr\": 0.028036092273891776,\n\
\ \"acc_norm\": 0.6013071895424836,\n \"acc_norm_stderr\": 0.028036092273891776\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n\
\ \"acc_stderr\": 0.027648149599751464,\n \"acc_norm\": 0.6141479099678456,\n\
\ \"acc_norm_stderr\": 0.027648149599751464\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.026571483480719964,\n\
\ \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.026571483480719964\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.38652482269503546,\n \"acc_stderr\": 0.029049190342543458,\n \
\ \"acc_norm\": 0.38652482269503546,\n \"acc_norm_stderr\": 0.029049190342543458\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38461538461538464,\n\
\ \"acc_stderr\": 0.012425548416302943,\n \"acc_norm\": 0.38461538461538464,\n\
\ \"acc_norm_stderr\": 0.012425548416302943\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.49264705882352944,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.49264705882352944,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5343137254901961,\n \"acc_stderr\": 0.020180144843307296,\n \
\ \"acc_norm\": 0.5343137254901961,\n \"acc_norm_stderr\": 0.020180144843307296\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.03125127591089165,\n\
\ \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.03125127591089165\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.030965903123573026,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.030965903123573026\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.034010526201040885,\n\
\ \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.034010526201040885\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32313341493268055,\n\
\ \"mc1_stderr\": 0.016371836286454604,\n \"mc2\": 0.45182279949368886,\n\
\ \"mc2_stderr\": 0.01544933845997067\n }\n}\n```"
repo_url: https://huggingface.co/Undi95/CodeEngine
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|arc:challenge|25_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hellaswag|10_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T11-51-31.235775.parquet'
- config_name: results
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- results_2023-09-12T11-51-31.235775.parquet
- split: latest
path:
- results_2023-09-12T11-51-31.235775.parquet
---
# Dataset Card for Evaluation run of Undi95/CodeEngine
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/CodeEngine
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/CodeEngine](https://huggingface.co/Undi95/CodeEngine) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__CodeEngine",
"harness_truthfulqa_mc_0",
split="train")
```
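The aggregated metrics can be loaded the same way from the "results" configuration; the minimal sketch below assumes the "latest" split declared in this repository's config list.
```python
from datasets import load_dataset

# Aggregated metrics for the most recent run; the "results" config and the
# "latest" split are both declared in this card's configs section.
results = load_dataset("open-llm-leaderboard/details_Undi95__CodeEngine",
                       "results",
                       split="latest")
print(results[0])  # a single row holding the aggregated scores
```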
## Latest results
These are the [latest results from run 2023-09-12T11:51:31.235775](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__CodeEngine/blob/main/results_2023-09-12T11-51-31.235775.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5433186047027702,
"acc_stderr": 0.03474642991909903,
"acc_norm": 0.5472592242286033,
"acc_norm_stderr": 0.03472683807272391,
"mc1": 0.32313341493268055,
"mc1_stderr": 0.016371836286454604,
"mc2": 0.45182279949368886,
"mc2_stderr": 0.01544933845997067
},
"harness|arc:challenge|25": {
"acc": 0.5494880546075085,
"acc_stderr": 0.014539646098471627,
"acc_norm": 0.5836177474402731,
"acc_norm_stderr": 0.014405618279436174
},
"harness|hellaswag|10": {
"acc": 0.6243776140211114,
"acc_stderr": 0.004832934529120793,
"acc_norm": 0.8227444732125074,
"acc_norm_stderr": 0.0038110434120246575
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5131578947368421,
"acc_stderr": 0.04067533136309174,
"acc_norm": 0.5131578947368421,
"acc_norm_stderr": 0.04067533136309174
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5849056603773585,
"acc_stderr": 0.03032594578928611,
"acc_norm": 0.5849056603773585,
"acc_norm_stderr": 0.03032594578928611
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5625,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.038124005659748335,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.038124005659748335
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929777,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.031967586978353627,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.031967586978353627
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.023636975996101806,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.023636975996101806
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.632258064516129,
"acc_stderr": 0.02743086657997347,
"acc_norm": 0.632258064516129,
"acc_norm_stderr": 0.02743086657997347
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.03510766597959217,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.03510766597959217
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.03663974994391244,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.03663974994391244
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6717171717171717,
"acc_stderr": 0.03345678422756776,
"acc_norm": 0.6717171717171717,
"acc_norm_stderr": 0.03345678422756776
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7564766839378239,
"acc_stderr": 0.030975436386845443,
"acc_norm": 0.7564766839378239,
"acc_norm_stderr": 0.030975436386845443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5,
"acc_stderr": 0.02535100632816969,
"acc_norm": 0.5,
"acc_norm_stderr": 0.02535100632816969
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073838,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073838
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.48739495798319327,
"acc_stderr": 0.03246816765752174,
"acc_norm": 0.48739495798319327,
"acc_norm_stderr": 0.03246816765752174
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7137614678899082,
"acc_stderr": 0.019379436628919975,
"acc_norm": 0.7137614678899082,
"acc_norm_stderr": 0.019379436628919975
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4212962962962963,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.4212962962962963,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7130801687763713,
"acc_stderr": 0.02944377302259469,
"acc_norm": 0.7130801687763713,
"acc_norm_stderr": 0.02944377302259469
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928276,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928276
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.04260735157644559,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.04260735157644559
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.045218299028335865,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.045218299028335865
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.04689765937278135,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.04689765937278135
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7393162393162394,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.7393162393162394,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7318007662835249,
"acc_stderr": 0.015842430835269407,
"acc_norm": 0.7318007662835249,
"acc_norm_stderr": 0.015842430835269407
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5982658959537572,
"acc_stderr": 0.026394104177643634,
"acc_norm": 0.5982658959537572,
"acc_norm_stderr": 0.026394104177643634
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35195530726256985,
"acc_stderr": 0.01597266852368907,
"acc_norm": 0.35195530726256985,
"acc_norm_stderr": 0.01597266852368907
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6013071895424836,
"acc_stderr": 0.028036092273891776,
"acc_norm": 0.6013071895424836,
"acc_norm_stderr": 0.028036092273891776
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6141479099678456,
"acc_stderr": 0.027648149599751464,
"acc_norm": 0.6141479099678456,
"acc_norm_stderr": 0.027648149599751464
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.026571483480719964,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.026571483480719964
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.38652482269503546,
"acc_stderr": 0.029049190342543458,
"acc_norm": 0.38652482269503546,
"acc_norm_stderr": 0.029049190342543458
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.38461538461538464,
"acc_stderr": 0.012425548416302943,
"acc_norm": 0.38461538461538464,
"acc_norm_stderr": 0.012425548416302943
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.49264705882352944,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.49264705882352944,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5343137254901961,
"acc_stderr": 0.020180144843307296,
"acc_norm": 0.5343137254901961,
"acc_norm_stderr": 0.020180144843307296
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6081632653061224,
"acc_stderr": 0.03125127591089165,
"acc_norm": 0.6081632653061224,
"acc_norm_stderr": 0.03125127591089165
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.030965903123573026,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.030965903123573026
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7309941520467836,
"acc_stderr": 0.034010526201040885,
"acc_norm": 0.7309941520467836,
"acc_norm_stderr": 0.034010526201040885
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32313341493268055,
"mc1_stderr": 0.016371836286454604,
"mc2": 0.45182279949368886,
"mc2_stderr": 0.01544933845997067
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_ahnyeonchan__OpenOrca-AYT-13B | 2023-09-12T12:01:00.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of ahnyeonchan/OpenOrca-AYT-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ahnyeonchan/OpenOrca-AYT-13B](https://huggingface.co/ahnyeonchan/OpenOrca-AYT-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ahnyeonchan__OpenOrca-AYT-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T11:59:42.167175](https://huggingface.co/datasets/open-llm-leaderboard/details_ahnyeonchan__OpenOrca-AYT-13B/blob/main/results_2023-09-12T11-59-42.167175.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each of them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25099724682072716,\n\
\ \"acc_stderr\": 0.031540059394048775,\n \"acc_norm\": 0.2516434596660952,\n\
\ \"acc_norm_stderr\": 0.0315503001240786,\n \"mc1\": 0.24724602203182375,\n\
\ \"mc1_stderr\": 0.015102404797359649,\n \"mc2\": NaN,\n \"\
mc2_stderr\": NaN\n },\n \"harness|arc:challenge|25\": {\n \"acc\"\
: 0.2363481228668942,\n \"acc_stderr\": 0.012414960524301844,\n \"\
acc_norm\": 0.2721843003412969,\n \"acc_norm_stderr\": 0.013006600406423707\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25801633140808605,\n\
\ \"acc_stderr\": 0.004366488167386391,\n \"acc_norm\": 0.26030671181039633,\n\
\ \"acc_norm_stderr\": 0.004379051357024141\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.037498507091740206,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.037498507091740206\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.28679245283018867,\n \"acc_stderr\": 0.027834912527544074,\n\
\ \"acc_norm\": 0.28679245283018867,\n \"acc_norm_stderr\": 0.027834912527544074\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2152777777777778,\n\
\ \"acc_stderr\": 0.03437079344106133,\n \"acc_norm\": 0.2152777777777778,\n\
\ \"acc_norm_stderr\": 0.03437079344106133\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165044\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.19653179190751446,\n\
\ \"acc_stderr\": 0.030299574664788147,\n \"acc_norm\": 0.19653179190751446,\n\
\ \"acc_norm_stderr\": 0.030299574664788147\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.225531914893617,\n \"acc_stderr\": 0.027321078417387533,\n\
\ \"acc_norm\": 0.225531914893617,\n \"acc_norm_stderr\": 0.027321078417387533\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04434600701584925,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04434600701584925\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.033333333333333284,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.033333333333333284\n \
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.21693121693121692,\n \"acc_stderr\": 0.021227082449445055,\n \"\
acc_norm\": 0.21693121693121692,\n \"acc_norm_stderr\": 0.021227082449445055\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604672,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604672\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.22258064516129034,\n \"acc_stderr\": 0.023664216671642518,\n \"\
acc_norm\": 0.22258064516129034,\n \"acc_norm_stderr\": 0.023664216671642518\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.22167487684729065,\n \"acc_stderr\": 0.029225575892489607,\n \"\
acc_norm\": 0.22167487684729065,\n \"acc_norm_stderr\": 0.029225575892489607\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.02962022787479048,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02962022787479048\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.26424870466321243,\n \"acc_stderr\": 0.03182155050916648,\n\
\ \"acc_norm\": 0.26424870466321243,\n \"acc_norm_stderr\": 0.03182155050916648\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2282051282051282,\n \"acc_stderr\": 0.02127839386358628,\n \
\ \"acc_norm\": 0.2282051282051282,\n \"acc_norm_stderr\": 0.02127839386358628\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275784,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275784\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176892,\n\
\ \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176892\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436775,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23853211009174313,\n \"acc_stderr\": 0.018272575810231863,\n \"\
acc_norm\": 0.23853211009174313,\n \"acc_norm_stderr\": 0.018272575810231863\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.25462962962962965,\n \"acc_stderr\": 0.02971127586000536,\n \"\
acc_norm\": 0.25462962962962965,\n \"acc_norm_stderr\": 0.02971127586000536\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.29901960784313725,\n \"acc_stderr\": 0.03213325717373618,\n \"\
acc_norm\": 0.29901960784313725,\n \"acc_norm_stderr\": 0.03213325717373618\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.20675105485232068,\n \"acc_stderr\": 0.026361651668389094,\n \
\ \"acc_norm\": 0.20675105485232068,\n \"acc_norm_stderr\": 0.026361651668389094\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.273542600896861,\n\
\ \"acc_stderr\": 0.029918586707798813,\n \"acc_norm\": 0.273542600896861,\n\
\ \"acc_norm_stderr\": 0.029918586707798813\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.03880848301082396,\n\
\ \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.03880848301082396\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.35537190082644626,\n \"acc_stderr\": 0.04369236326573981,\n \"\
acc_norm\": 0.35537190082644626,\n \"acc_norm_stderr\": 0.04369236326573981\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25766871165644173,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.25766871165644173,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.044642857142857116,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.044642857142857116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24786324786324787,\n\
\ \"acc_stderr\": 0.028286324075564393,\n \"acc_norm\": 0.24786324786324787,\n\
\ \"acc_norm_stderr\": 0.028286324075564393\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24265644955300128,\n\
\ \"acc_stderr\": 0.015329888940899867,\n \"acc_norm\": 0.24265644955300128,\n\
\ \"acc_norm_stderr\": 0.015329888940899867\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.27167630057803466,\n \"acc_stderr\": 0.023948512905468358,\n\
\ \"acc_norm\": 0.27167630057803466,\n \"acc_norm_stderr\": 0.023948512905468358\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27150837988826815,\n\
\ \"acc_stderr\": 0.014874252168095266,\n \"acc_norm\": 0.27150837988826815,\n\
\ \"acc_norm_stderr\": 0.014874252168095266\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.30718954248366015,\n \"acc_stderr\": 0.02641560191438898,\n\
\ \"acc_norm\": 0.30718954248366015,\n \"acc_norm_stderr\": 0.02641560191438898\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24758842443729903,\n\
\ \"acc_stderr\": 0.024513879973621967,\n \"acc_norm\": 0.24758842443729903,\n\
\ \"acc_norm_stderr\": 0.024513879973621967\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.27469135802469136,\n \"acc_stderr\": 0.02483605786829468,\n\
\ \"acc_norm\": 0.27469135802469136,\n \"acc_norm_stderr\": 0.02483605786829468\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.30141843971631205,\n \"acc_stderr\": 0.02737412888263115,\n \
\ \"acc_norm\": 0.30141843971631205,\n \"acc_norm_stderr\": 0.02737412888263115\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2542372881355932,\n\
\ \"acc_stderr\": 0.011121129007840673,\n \"acc_norm\": 0.2542372881355932,\n\
\ \"acc_norm_stderr\": 0.011121129007840673\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.27205882352941174,\n \"acc_stderr\": 0.027033041151681456,\n\
\ \"acc_norm\": 0.27205882352941174,\n \"acc_norm_stderr\": 0.027033041151681456\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2630718954248366,\n \"acc_stderr\": 0.017812676542320653,\n \
\ \"acc_norm\": 0.2630718954248366,\n \"acc_norm_stderr\": 0.017812676542320653\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.23636363636363636,\n\
\ \"acc_stderr\": 0.040693063197213775,\n \"acc_norm\": 0.23636363636363636,\n\
\ \"acc_norm_stderr\": 0.040693063197213775\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.20408163265306123,\n \"acc_stderr\": 0.0258012834750905,\n\
\ \"acc_norm\": 0.20408163265306123,\n \"acc_norm_stderr\": 0.0258012834750905\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.27860696517412936,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.27860696517412936,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2469879518072289,\n\
\ \"acc_stderr\": 0.03357351982064537,\n \"acc_norm\": 0.2469879518072289,\n\
\ \"acc_norm_stderr\": 0.03357351982064537\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.0330140594698725,\n\
\ \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.0330140594698725\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24724602203182375,\n\
\ \"mc1_stderr\": 0.015102404797359649,\n \"mc2\": NaN,\n \"\
mc2_stderr\": NaN\n }\n}\n```"
repo_url: https://huggingface.co/ahnyeonchan/OpenOrca-AYT-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|arc:challenge|25_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hellaswag|10_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T11-59-42.167175.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T11-59-42.167175.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T11-59-42.167175.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T11-59-42.167175.parquet'
- config_name: results
data_files:
- split: 2023_09_12T11_59_42.167175
path:
- results_2023-09-12T11-59-42.167175.parquet
- split: latest
path:
- results_2023-09-12T11-59-42.167175.parquet
---
# Dataset Card for Evaluation run of ahnyeonchan/OpenOrca-AYT-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ahnyeonchan/OpenOrca-AYT-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ahnyeonchan/OpenOrca-AYT-13B](https://huggingface.co/ahnyeonchan/OpenOrca-AYT-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ahnyeonchan__OpenOrca-AYT-13B",
"harness_truthfulqa_mc_0",
split="train")
```
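Beyond per-example details, the aggregated metrics of the run can be pulled from the "results" configuration. Below is a minimal sketch, assuming the "results" configuration exposes the same "latest" (plus timestamped) split convention as the task configurations listed in this card:
```python
from datasets import load_dataset

# A sketch: load the aggregated run metrics rather than per-example details.
# Assumes the "results" configuration follows the same split convention
# ("latest" plus a timestamped split) as the task configurations above.
results = load_dataset(
    "open-llm-leaderboard/details_ahnyeonchan__OpenOrca-AYT-13B",
    "results",
    split="latest",
)
print(results[0])  # one row per run, holding the aggregated metrics
```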
## Latest results
These are the [latest results from run 2023-09-12T11:59:42.167175](https://huggingface.co/datasets/open-llm-leaderboard/details_ahnyeonchan__OpenOrca-AYT-13B/blob/main/results_2023-09-12T11-59-42.167175.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25099724682072716,
"acc_stderr": 0.031540059394048775,
"acc_norm": 0.2516434596660952,
"acc_norm_stderr": 0.0315503001240786,
"mc1": 0.24724602203182375,
"mc1_stderr": 0.015102404797359649,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.2363481228668942,
"acc_stderr": 0.012414960524301844,
"acc_norm": 0.2721843003412969,
"acc_norm_stderr": 0.013006600406423707
},
"harness|hellaswag|10": {
"acc": 0.25801633140808605,
"acc_stderr": 0.004366488167386391,
"acc_norm": 0.26030671181039633,
"acc_norm_stderr": 0.004379051357024141
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.037498507091740206,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.037498507091740206
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.28679245283018867,
"acc_stderr": 0.027834912527544074,
"acc_norm": 0.28679245283018867,
"acc_norm_stderr": 0.027834912527544074
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2152777777777778,
"acc_stderr": 0.03437079344106133,
"acc_norm": 0.2152777777777778,
"acc_norm_stderr": 0.03437079344106133
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.19653179190751446,
"acc_stderr": 0.030299574664788147,
"acc_norm": 0.19653179190751446,
"acc_norm_stderr": 0.030299574664788147
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.225531914893617,
"acc_stderr": 0.027321078417387533,
"acc_norm": 0.225531914893617,
"acc_norm_stderr": 0.027321078417387533
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04434600701584925,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04434600701584925
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2,
"acc_stderr": 0.033333333333333284,
"acc_norm": 0.2,
"acc_norm_stderr": 0.033333333333333284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.21693121693121692,
"acc_stderr": 0.021227082449445055,
"acc_norm": 0.21693121693121692,
"acc_norm_stderr": 0.021227082449445055
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604672,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604672
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22258064516129034,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.22258064516129034,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.22167487684729065,
"acc_stderr": 0.029225575892489607,
"acc_norm": 0.22167487684729065,
"acc_norm_stderr": 0.029225575892489607
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.26424870466321243,
"acc_stderr": 0.03182155050916648,
"acc_norm": 0.26424870466321243,
"acc_norm_stderr": 0.03182155050916648
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2282051282051282,
"acc_stderr": 0.02127839386358628,
"acc_norm": 0.2282051282051282,
"acc_norm_stderr": 0.02127839386358628
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275784,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275784
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436775,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23853211009174313,
"acc_stderr": 0.018272575810231863,
"acc_norm": 0.23853211009174313,
"acc_norm_stderr": 0.018272575810231863
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25462962962962965,
"acc_stderr": 0.02971127586000536,
"acc_norm": 0.25462962962962965,
"acc_norm_stderr": 0.02971127586000536
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.29901960784313725,
"acc_stderr": 0.03213325717373618,
"acc_norm": 0.29901960784313725,
"acc_norm_stderr": 0.03213325717373618
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.20675105485232068,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.20675105485232068,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.273542600896861,
"acc_stderr": 0.029918586707798813,
"acc_norm": 0.273542600896861,
"acc_norm_stderr": 0.029918586707798813
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.03880848301082396,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.03880848301082396
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.35537190082644626,
"acc_stderr": 0.04369236326573981,
"acc_norm": 0.35537190082644626,
"acc_norm_stderr": 0.04369236326573981
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25766871165644173,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.25766871165644173,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.044642857142857116,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.044642857142857116
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24786324786324787,
"acc_stderr": 0.028286324075564393,
"acc_norm": 0.24786324786324787,
"acc_norm_stderr": 0.028286324075564393
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24265644955300128,
"acc_stderr": 0.015329888940899867,
"acc_norm": 0.24265644955300128,
"acc_norm_stderr": 0.015329888940899867
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.27167630057803466,
"acc_stderr": 0.023948512905468358,
"acc_norm": 0.27167630057803466,
"acc_norm_stderr": 0.023948512905468358
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27150837988826815,
"acc_stderr": 0.014874252168095266,
"acc_norm": 0.27150837988826815,
"acc_norm_stderr": 0.014874252168095266
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.30718954248366015,
"acc_stderr": 0.02641560191438898,
"acc_norm": 0.30718954248366015,
"acc_norm_stderr": 0.02641560191438898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24758842443729903,
"acc_stderr": 0.024513879973621967,
"acc_norm": 0.24758842443729903,
"acc_norm_stderr": 0.024513879973621967
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.27469135802469136,
"acc_stderr": 0.02483605786829468,
"acc_norm": 0.27469135802469136,
"acc_norm_stderr": 0.02483605786829468
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.30141843971631205,
"acc_stderr": 0.02737412888263115,
"acc_norm": 0.30141843971631205,
"acc_norm_stderr": 0.02737412888263115
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2542372881355932,
"acc_stderr": 0.011121129007840673,
"acc_norm": 0.2542372881355932,
"acc_norm_stderr": 0.011121129007840673
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.27205882352941174,
"acc_stderr": 0.027033041151681456,
"acc_norm": 0.27205882352941174,
"acc_norm_stderr": 0.027033041151681456
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2630718954248366,
"acc_stderr": 0.017812676542320653,
"acc_norm": 0.2630718954248366,
"acc_norm_stderr": 0.017812676542320653
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.040693063197213775,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.040693063197213775
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.20408163265306123,
"acc_stderr": 0.0258012834750905,
"acc_norm": 0.20408163265306123,
"acc_norm_stderr": 0.0258012834750905
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.27860696517412936,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.27860696517412936,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2469879518072289,
"acc_stderr": 0.03357351982064537,
"acc_norm": 0.2469879518072289,
"acc_norm_stderr": 0.03357351982064537
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.0330140594698725,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.0330140594698725
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24724602203182375,
"mc1_stderr": 0.015102404797359649,
"mc2": NaN,
"mc2_stderr": NaN
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TFLai__Orca-Nova-13B | 2023-09-12T12:07:09.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TFLai/Orca-Nova-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TFLai/Orca-Nova-13B](https://huggingface.co/TFLai/Orca-Nova-13B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__Orca-Nova-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T12:05:50.844177](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__Orca-Nova-13B/blob/main/results_2023-09-12T12-05-50.844177.json)\
\ (note that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5752242847256397,\n\
\ \"acc_stderr\": 0.03431229792421787,\n \"acc_norm\": 0.5795163580926991,\n\
\ \"acc_norm_stderr\": 0.03428958992609846,\n \"mc1\": 0.3219094247246022,\n\
\ \"mc1_stderr\": 0.0163555676119604,\n \"mc2\": 0.4597465231311121,\n\
\ \"mc2_stderr\": 0.015201965610936554\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5742320819112628,\n \"acc_stderr\": 0.014449464278868809,\n\
\ \"acc_norm\": 0.6237201365187713,\n \"acc_norm_stderr\": 0.014157022555407161\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.620991834295957,\n\
\ \"acc_stderr\": 0.004841486716855767,\n \"acc_norm\": 0.8247361083449513,\n\
\ \"acc_norm_stderr\": 0.003794156551272275\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490436,\n\
\ \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490436\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6377358490566037,\n \"acc_stderr\": 0.029582245128384303,\n\
\ \"acc_norm\": 0.6377358490566037,\n \"acc_norm_stderr\": 0.029582245128384303\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6319444444444444,\n\
\ \"acc_stderr\": 0.040329990539607195,\n \"acc_norm\": 0.6319444444444444,\n\
\ \"acc_norm_stderr\": 0.040329990539607195\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3492063492063492,\n \"acc_stderr\": 0.02455229220934266,\n \"\
acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.02455229220934266\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7032258064516129,\n\
\ \"acc_stderr\": 0.025988500792411887,\n \"acc_norm\": 0.7032258064516129,\n\
\ \"acc_norm_stderr\": 0.025988500792411887\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091706,\n\
\ \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091706\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.02951928261681723,\n\
\ \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.02951928261681723\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.02469721693087894,\n \
\ \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.02469721693087894\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7871559633027523,\n \"acc_stderr\": 0.017549376389313694,\n \"\
acc_norm\": 0.7871559633027523,\n \"acc_norm_stderr\": 0.017549376389313694\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.02765215314415926,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.02765215314415926\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n\
\ \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.6278026905829597,\n\
\ \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285713,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285713\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8034188034188035,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.8034188034188035,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-miscellaneous|5\"\
: {\n \"acc\": 0.7445721583652618,\n \"acc_stderr\": 0.01559495538445576,\n\
\ \"acc_norm\": 0.7445721583652618,\n \"acc_norm_stderr\": 0.01559495538445576\n\
\ },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.02599247202930639,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.02599247202930639\n },\n \"harness|hendrycksTest-moral_scenarios|5\"\
: {\n \"acc\": 0.39553072625698327,\n \"acc_stderr\": 0.01635341541007577,\n\
\ \"acc_norm\": 0.39553072625698327,\n \"acc_norm_stderr\": 0.01635341541007577\n\
\ },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5849673202614379,\n\
\ \"acc_stderr\": 0.028213504177824096,\n \"acc_norm\": 0.5849673202614379,\n\
\ \"acc_norm_stderr\": 0.028213504177824096\n },\n \"harness|hendrycksTest-philosophy|5\"\
: {\n \"acc\": 0.6463022508038585,\n \"acc_stderr\": 0.027155208103200865,\n\
\ \"acc_norm\": 0.6463022508038585,\n \"acc_norm_stderr\": 0.027155208103200865\n\
\ },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6697530864197531,\n\
\ \"acc_stderr\": 0.026168298456732846,\n \"acc_norm\": 0.6697530864197531,\n\
\ \"acc_norm_stderr\": 0.026168298456732846\n },\n \"harness|hendrycksTest-professional_accounting|5\"\
: {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.02949482760014437,\n\
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.02949482760014437\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42046936114732725,\n\
\ \"acc_stderr\": 0.012607654553832705,\n \"acc_norm\": 0.42046936114732725,\n\
\ \"acc_norm_stderr\": 0.012607654553832705\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5955882352941176,\n \"acc_stderr\": 0.02981263070156974,\n\
\ \"acc_norm\": 0.5955882352941176,\n \"acc_norm_stderr\": 0.02981263070156974\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.565359477124183,\n \"acc_stderr\": 0.020054269200726452,\n \
\ \"acc_norm\": 0.565359477124183,\n \"acc_norm_stderr\": 0.020054269200726452\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n\
\ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7611940298507462,\n\
\ \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.7611940298507462,\n\
\ \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036847,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036847\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03377310252209205,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03377310252209205\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3219094247246022,\n\
\ \"mc1_stderr\": 0.0163555676119604,\n \"mc2\": 0.4597465231311121,\n\
\ \"mc2_stderr\": 0.015201965610936554\n }\n}\n```"
repo_url: https://huggingface.co/TFLai/Orca-Nova-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|arc:challenge|25_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hellaswag|10_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T12-05-50.844177.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T12-05-50.844177.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T12-05-50.844177.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T12-05-50.844177.parquet'
- config_name: results
data_files:
- split: 2023_09_12T12_05_50.844177
path:
- results_2023-09-12T12-05-50.844177.parquet
- split: latest
path:
- results_2023-09-12T12-05-50.844177.parquet
---
# Dataset Card for Evaluation run of TFLai/Orca-Nova-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/Orca-Nova-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/Orca-Nova-13B](https://huggingface.co/TFLai/Orca-Nova-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__Orca-Nova-13B",
"harness_truthfulqa_mc_0",
split="train")
```
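Each configuration listed in the YAML above also exposes a `latest` split, and the aggregated metrics live in the `results` configuration. As a minimal sketch, using only the config and split names declared in this card:
```python
from datasets import load_dataset

# Aggregated metrics of the run; the "latest" split always points
# to the most recent evaluation timestamp.
results = load_dataset("open-llm-leaderboard/details_TFLai__Orca-Nova-13B",
                       "results",
                       split="latest")

# Per-task details work the same way, e.g. for one MMLU subtask:
details = load_dataset("open-llm-leaderboard/details_TFLai__Orca-Nova-13B",
                       "harness_hendrycksTest_abstract_algebra_5",
                       split="latest")
```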
## Latest results
These are the [latest results from run 2023-09-12T12:05:50.844177](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__Orca-Nova-13B/blob/main/results_2023-09-12T12-05-50.844177.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5752242847256397,
"acc_stderr": 0.03431229792421787,
"acc_norm": 0.5795163580926991,
"acc_norm_stderr": 0.03428958992609846,
"mc1": 0.3219094247246022,
"mc1_stderr": 0.0163555676119604,
"mc2": 0.4597465231311121,
"mc2_stderr": 0.015201965610936554
},
"harness|arc:challenge|25": {
"acc": 0.5742320819112628,
"acc_stderr": 0.014449464278868809,
"acc_norm": 0.6237201365187713,
"acc_norm_stderr": 0.014157022555407161
},
"harness|hellaswag|10": {
"acc": 0.620991834295957,
"acc_stderr": 0.004841486716855767,
"acc_norm": 0.8247361083449513,
"acc_norm_stderr": 0.003794156551272275
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.04040311062490436,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.04040311062490436
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6377358490566037,
"acc_stderr": 0.029582245128384303,
"acc_norm": 0.6377358490566037,
"acc_norm_stderr": 0.029582245128384303
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6319444444444444,
"acc_stderr": 0.040329990539607195,
"acc_norm": 0.6319444444444444,
"acc_norm_stderr": 0.040329990539607195
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4723404255319149,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.4723404255319149,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.02455229220934266,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.02455229220934266
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7032258064516129,
"acc_stderr": 0.025988500792411887,
"acc_norm": 0.7032258064516129,
"acc_norm_stderr": 0.025988500792411887
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091706,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091706
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338642,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338642
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.02951928261681723,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.02951928261681723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.02469721693087894,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.02469721693087894
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652458,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652458
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7871559633027523,
"acc_stderr": 0.017549376389313694,
"acc_norm": 0.7871559633027523,
"acc_norm_stderr": 0.017549376389313694
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.02765215314415926,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.02765215314415926
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.03244305283008731,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.03244305283008731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285713,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285713
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8034188034188035,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.8034188034188035,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7445721583652618,
"acc_stderr": 0.01559495538445576,
"acc_norm": 0.7445721583652618,
"acc_norm_stderr": 0.01559495538445576
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.02599247202930639,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.02599247202930639
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39553072625698327,
"acc_stderr": 0.01635341541007577,
"acc_norm": 0.39553072625698327,
"acc_norm_stderr": 0.01635341541007577
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5849673202614379,
"acc_stderr": 0.028213504177824096,
"acc_norm": 0.5849673202614379,
"acc_norm_stderr": 0.028213504177824096
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.027155208103200865,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.027155208103200865
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6697530864197531,
"acc_stderr": 0.026168298456732846,
"acc_norm": 0.6697530864197531,
"acc_norm_stderr": 0.026168298456732846
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.02949482760014437,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.02949482760014437
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42046936114732725,
"acc_stderr": 0.012607654553832705,
"acc_norm": 0.42046936114732725,
"acc_norm_stderr": 0.012607654553832705
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5955882352941176,
"acc_stderr": 0.02981263070156974,
"acc_norm": 0.5955882352941176,
"acc_norm_stderr": 0.02981263070156974
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.565359477124183,
"acc_stderr": 0.020054269200726452,
"acc_norm": 0.565359477124183,
"acc_norm_stderr": 0.020054269200726452
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7611940298507462,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.7611940298507462,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036847,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036847
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03377310252209205,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03377310252209205
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3219094247246022,
"mc1_stderr": 0.0163555676119604,
"mc2": 0.4597465231311121,
"mc2_stderr": 0.015201965610936554
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
d0rj/RuBQ_2.0-paragraphs | 2023-09-15T12:16:41.000Z | [
"task_categories:question-answering",
"size_categories:10K<n<100K",
"source_datasets:original",
"language:ru",
"language:en",
"license:cc-by-sa-4.0",
"qa",
"machine reading",
"arxiv:2005.10659",
"region:us"
] | d0rj | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: paragraphs
path: data/paragraphs-*
dataset_info:
features:
- name: uid
dtype: int64
- name: ru_wiki_pageid
dtype: int64
- name: text
dtype: string
splits:
- name: paragraphs
num_bytes: 47303369
num_examples: 56952
download_size: 24269133
dataset_size: 47303369
license: cc-by-sa-4.0
task_categories:
- question-answering
language:
- ru
- en
tags:
- qa
- machine reading
source_datasets:
- original
pretty_name: RuBQ 2.0
size_categories:
- 10K<n<100K
paperswithcode_id: rubq
---
# RuBQ_2.0-paragraphs
## Dataset Description
- **Repository:** https://github.com/vladislavneon/RuBQ/tree/master/RuBQ_2.0
- **Paper:** [RuBQ: A Russian Dataset for Question Answering over Wikidata](https://arxiv.org/abs/2005.10659)
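To get a feel for the data, here is a minimal loading sketch (assuming the `datasets` library is installed; the `paragraphs` split and the `uid`, `ru_wiki_pageid`, and `text` fields come from the dataset metadata above):
```python
from datasets import load_dataset

# The dataset exposes a single "paragraphs" split (56,952 examples).
paragraphs = load_dataset("d0rj/RuBQ_2.0-paragraphs", split="paragraphs")

# Each record carries a paragraph id, the source Russian Wikipedia page id,
# and the paragraph text itself.
example = paragraphs[0]
print(example["uid"], example["ru_wiki_pageid"])
print(example["text"][:200])
```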
For **test** and **dev** data, see [d0rj/RuBQ_2.0](https://huggingface.co/datasets/d0rj/RuBQ_2.0). |
open-llm-leaderboard/details__fsx_shared-falcon-180B_converted_safetensors | 2023-09-12T12:26:00.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of _fsx_shared-falcon-180B_converted_safetensors
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [_fsx_shared-falcon-180B_converted_safetensors](https://huggingface.co/_fsx_shared-falcon-180B_converted_safetensors)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details__fsx_shared-falcon-180B_converted_safetensors\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T12:25:36.361219](https://huggingface.co/datasets/open-llm-leaderboard/details__fsx_shared-falcon-180B_converted_safetensors/blob/main/results_2023-09-12T12-25-36.361219.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6616129047646827,\n\
\ \"acc_stderr\": 0.032465318041276114,\n \"acc_norm\": 0.6652364807936633,\n\
\ \"acc_norm_stderr\": 0.032437975492240805,\n \"mc1\": 0.43451652386780903,\n\
\ \"mc1_stderr\": 0.017352738749259564,\n \"mc2\": 0.6219407369869773,\n\
\ \"mc2_stderr\": 0.015400116321762768\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6791808873720137,\n \"acc_stderr\": 0.013640943091946524,\n\
\ \"acc_norm\": 0.7090443686006825,\n \"acc_norm_stderr\": 0.013273077865907588\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.681736705835491,\n\
\ \"acc_stderr\": 0.004648503177353969,\n \"acc_norm\": 0.86566421031667,\n\
\ \"acc_norm_stderr\": 0.0034031580103095565\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.67,\n\
\ \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \
\ \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700914,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700914\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.034765901043041336,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.034765901043041336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.041633319989322626\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6212765957446809,\n \"acc_stderr\": 0.031709956060406545,\n\
\ \"acc_norm\": 0.6212765957446809,\n \"acc_norm_stderr\": 0.031709956060406545\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.045796394220704334,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.045796394220704334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.040703290137070705,\n\
\ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.040703290137070705\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4894179894179894,\n \"acc_stderr\": 0.02574554227604549,\n \"\
acc_norm\": 0.4894179894179894,\n \"acc_norm_stderr\": 0.02574554227604549\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8096774193548387,\n \"acc_stderr\": 0.022331707611823078,\n \"\
acc_norm\": 0.8096774193548387,\n \"acc_norm_stderr\": 0.022331707611823078\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"\
acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8232323232323232,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.8232323232323232,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.01871899852067817,\n\
\ \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.01871899852067817\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062157,\n\
\ \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062157\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.726890756302521,\n \"acc_stderr\": 0.028942004040998167,\n \
\ \"acc_norm\": 0.726890756302521,\n \"acc_norm_stderr\": 0.028942004040998167\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659808,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659808\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8678899082568807,\n \"acc_stderr\": 0.014517801914598236,\n \"\
acc_norm\": 0.8678899082568807,\n \"acc_norm_stderr\": 0.014517801914598236\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944846,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944846\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7488789237668162,\n\
\ \"acc_stderr\": 0.029105220833224605,\n \"acc_norm\": 0.7488789237668162,\n\
\ \"acc_norm_stderr\": 0.029105220833224605\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5982142857142857,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.5982142857142857,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8454661558109834,\n\
\ \"acc_stderr\": 0.012925773495095957,\n \"acc_norm\": 0.8454661558109834,\n\
\ \"acc_norm_stderr\": 0.012925773495095957\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044297,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044297\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4860335195530726,\n\
\ \"acc_stderr\": 0.016715976410744515,\n \"acc_norm\": 0.4860335195530726,\n\
\ \"acc_norm_stderr\": 0.016715976410744515\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.024954184324879912,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.024954184324879912\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n\
\ \"acc_stderr\": 0.02521804037341062,\n \"acc_norm\": 0.729903536977492,\n\
\ \"acc_norm_stderr\": 0.02521804037341062\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7654320987654321,\n \"acc_stderr\": 0.023576881744005716,\n\
\ \"acc_norm\": 0.7654320987654321,\n \"acc_norm_stderr\": 0.023576881744005716\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5097783572359843,\n\
\ \"acc_stderr\": 0.012767793787729341,\n \"acc_norm\": 0.5097783572359843,\n\
\ \"acc_norm_stderr\": 0.012767793787729341\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.027678468642144703,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.027678468642144703\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6862745098039216,\n \"acc_stderr\": 0.01877168389352818,\n \
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.01877168389352818\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154185,\n\
\ \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154185\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827075,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827075\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160875,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160875\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.43451652386780903,\n\
\ \"mc1_stderr\": 0.017352738749259564,\n \"mc2\": 0.6219407369869773,\n\
\ \"mc2_stderr\": 0.015400116321762768\n }\n}\n```"
repo_url: https://huggingface.co/_fsx_shared-falcon-180B_converted_safetensors
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|arc:challenge|25_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hellaswag|10_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T12-25-36.361219.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T12-25-36.361219.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T12-25-36.361219.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T12-25-36.361219.parquet'
- config_name: results
data_files:
- split: 2023_09_12T12_25_36.361219
path:
- results_2023-09-12T12-25-36.361219.parquet
- split: latest
path:
- results_2023-09-12T12-25-36.361219.parquet
---
# Dataset Card for Evaluation run of _fsx_shared-falcon-180B_converted_safetensors
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/_fsx_shared-falcon-180B_converted_safetensors
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [_fsx_shared-falcon-180B_converted_safetensors](https://huggingface.co/_fsx_shared-falcon-180B_converted_safetensors) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details__fsx_shared-falcon-180B_converted_safetensors",
"harness_truthfulqa_mc_0",
split="train")
```
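The aggregated metrics can be pulled the same way through the "results" configuration; a short sketch, assuming the config and split names declared in this card's metadata:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# its "latest" split always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details__fsx_shared-falcon-180B_converted_safetensors",
    "results",
    split="latest",
)
print(results[0])
```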
## Latest results
These are the [latest results from run 2023-09-12T12:25:36.361219](https://huggingface.co/datasets/open-llm-leaderboard/details__fsx_shared-falcon-180B_converted_safetensors/blob/main/results_2023-09-12T12-25-36.361219.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6616129047646827,
"acc_stderr": 0.032465318041276114,
"acc_norm": 0.6652364807936633,
"acc_norm_stderr": 0.032437975492240805,
"mc1": 0.43451652386780903,
"mc1_stderr": 0.017352738749259564,
"mc2": 0.6219407369869773,
"mc2_stderr": 0.015400116321762768
},
"harness|arc:challenge|25": {
"acc": 0.6791808873720137,
"acc_stderr": 0.013640943091946524,
"acc_norm": 0.7090443686006825,
"acc_norm_stderr": 0.013273077865907588
},
"harness|hellaswag|10": {
"acc": 0.681736705835491,
"acc_stderr": 0.004648503177353969,
"acc_norm": 0.86566421031667,
"acc_norm_stderr": 0.0034031580103095565
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700914,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700914
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.034765901043041336,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.034765901043041336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6212765957446809,
"acc_stderr": 0.031709956060406545,
"acc_norm": 0.6212765957446809,
"acc_norm_stderr": 0.031709956060406545
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.045796394220704334,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.045796394220704334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.040703290137070705,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.040703290137070705
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4894179894179894,
"acc_stderr": 0.02574554227604549,
"acc_norm": 0.4894179894179894,
"acc_norm_stderr": 0.02574554227604549
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8096774193548387,
"acc_stderr": 0.022331707611823078,
"acc_norm": 0.8096774193548387,
"acc_norm_stderr": 0.022331707611823078
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8232323232323232,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.8232323232323232,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.01871899852067817,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.01871899852067817
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.024243783994062157,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.024243783994062157
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.726890756302521,
"acc_stderr": 0.028942004040998167,
"acc_norm": 0.726890756302521,
"acc_norm_stderr": 0.028942004040998167
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659808,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659808
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8678899082568807,
"acc_stderr": 0.014517801914598236,
"acc_norm": 0.8678899082568807,
"acc_norm_stderr": 0.014517801914598236
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944846,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944846
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7488789237668162,
"acc_stderr": 0.029105220833224605,
"acc_norm": 0.7488789237668162,
"acc_norm_stderr": 0.029105220833224605
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5982142857142857,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.5982142857142857,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8454661558109834,
"acc_stderr": 0.012925773495095957,
"acc_norm": 0.8454661558109834,
"acc_norm_stderr": 0.012925773495095957
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044297,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044297
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4860335195530726,
"acc_stderr": 0.016715976410744515,
"acc_norm": 0.4860335195530726,
"acc_norm_stderr": 0.016715976410744515
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.024954184324879912,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.024954184324879912
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.729903536977492,
"acc_stderr": 0.02521804037341062,
"acc_norm": 0.729903536977492,
"acc_norm_stderr": 0.02521804037341062
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7654320987654321,
"acc_stderr": 0.023576881744005716,
"acc_norm": 0.7654320987654321,
"acc_norm_stderr": 0.023576881744005716
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5097783572359843,
"acc_stderr": 0.012767793787729341,
"acc_norm": 0.5097783572359843,
"acc_norm_stderr": 0.012767793787729341
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.027678468642144703,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.027678468642144703
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.01877168389352818,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.01877168389352818
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6612244897959184,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.6612244897959184,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827075,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827075
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160875,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160875
},
"harness|truthfulqa:mc|0": {
"mc1": 0.43451652386780903,
"mc1_stderr": 0.017352738749259564,
"mc2": 0.6219407369869773,
"mc2_stderr": 0.015400116321762768
}
}
```
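The per-task dictionary above is plain JSON, so it is straightforward to post-process. As a minimal sketch (pandas is an assumption here, not part of this card), the hendrycksTest subtasks can be ranked by normalized accuracy; `results` stands in for the dict printed above, with only two entries reproduced for brevity:
```python
import pandas as pd

# `results` stands in for the per-task dict printed above; only two of its
# entries are reproduced here to keep the sketch short.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.33},
    "harness|hendrycksTest-world_religions|5": {"acc_norm": 0.8421052631578947},
}

# Keep the MMLU (hendrycksTest) entries and sort them by acc_norm.
# str.removeprefix requires Python 3.9+.
mmlu = {
    task.split("|")[1].removeprefix("hendrycksTest-"): scores["acc_norm"]
    for task, scores in results.items()
    if "hendrycksTest" in task
}
print(pd.Series(mmlu, name="acc_norm").sort_values(ascending=False))
```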
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
DevOps-Eval/devopseval-exam | 2023-09-12T13:03:33.000Z | [
"license:mit",
"region:us"
] | DevOps-Eval | null | null | null | 0 | 0 | ---
license: mit
---
|
open-llm-leaderboard/details_PY007__TinyLlama-1.1B-step-50K-105b | 2023-09-12T12:31:19.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of PY007/TinyLlama-1.1B-step-50K-105b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PY007/TinyLlama-1.1B-step-50K-105b](https://huggingface.co/PY007/TinyLlama-1.1B-step-50K-105b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PY007__TinyLlama-1.1B-step-50K-105b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T12:30:04.204611](https://huggingface.co/datasets/open-llm-leaderboard/details_PY007__TinyLlama-1.1B-step-50K-105b/blob/main/results_2023-09-12T12-30-04.204611.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2687179047639338,\n\
\ \"acc_stderr\": 0.03195135753112674,\n \"acc_norm\": 0.27059968140846513,\n\
\ \"acc_norm_stderr\": 0.03196141859030811,\n \"mc1\": 0.22399020807833536,\n\
\ \"mc1_stderr\": 0.014594964329474205,\n \"mc2\": 0.39509729118481285,\n\
\ \"mc2_stderr\": 0.014227368850578669\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.23464163822525597,\n \"acc_stderr\": 0.012383873560768682,\n\
\ \"acc_norm\": 0.25853242320819114,\n \"acc_norm_stderr\": 0.012794553754288682\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.35391356303525195,\n\
\ \"acc_stderr\": 0.004772054904404421,\n \"acc_norm\": 0.4410476000796654,\n\
\ \"acc_norm_stderr\": 0.004954977202585471\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.03633384414073462,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.03633384414073462\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2565789473684211,\n \"acc_stderr\": 0.0355418036802569,\n\
\ \"acc_norm\": 0.2565789473684211,\n \"acc_norm_stderr\": 0.0355418036802569\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n\
\ \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2981132075471698,\n \"acc_stderr\": 0.028152837942493857,\n\
\ \"acc_norm\": 0.2981132075471698,\n \"acc_norm_stderr\": 0.028152837942493857\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n\
\ \"acc_stderr\": 0.035146974678623884,\n \"acc_norm\": 0.22916666666666666,\n\
\ \"acc_norm_stderr\": 0.035146974678623884\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2832369942196532,\n\
\ \"acc_stderr\": 0.03435568056047874,\n \"acc_norm\": 0.2832369942196532,\n\
\ \"acc_norm_stderr\": 0.03435568056047874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2170212765957447,\n \"acc_stderr\": 0.02694748312149622,\n\
\ \"acc_norm\": 0.2170212765957447,\n \"acc_norm_stderr\": 0.02694748312149622\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.036951833116502325,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.036951833116502325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577656,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577656\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047182,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047182\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25483870967741934,\n\
\ \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.25483870967741934,\n\
\ \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.03161856335358609,\n\
\ \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.03161856335358609\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.031234752377721175,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.031234752377721175\n \
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.30303030303030304,\n \"acc_stderr\": 0.032742879140268674,\n \"\
acc_norm\": 0.30303030303030304,\n \"acc_norm_stderr\": 0.032742879140268674\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.35751295336787564,\n \"acc_stderr\": 0.03458816042181004,\n\
\ \"acc_norm\": 0.35751295336787564,\n \"acc_norm_stderr\": 0.03458816042181004\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.36153846153846153,\n \"acc_stderr\": 0.024359581465396987,\n\
\ \"acc_norm\": 0.36153846153846153,\n \"acc_norm_stderr\": 0.024359581465396987\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23703703703703705,\n \"acc_stderr\": 0.02592887613276612,\n \
\ \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.02592887613276612\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.18487394957983194,\n \"acc_stderr\": 0.025215992877954202,\n\
\ \"acc_norm\": 0.18487394957983194,\n \"acc_norm_stderr\": 0.025215992877954202\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658754,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658754\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3486238532110092,\n \"acc_stderr\": 0.020431254090714328,\n \"\
acc_norm\": 0.3486238532110092,\n \"acc_norm_stderr\": 0.020431254090714328\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.25738396624472576,\n \"acc_stderr\": 0.028458820991460295,\n\
\ \"acc_norm\": 0.25738396624472576,\n \"acc_norm_stderr\": 0.028458820991460295\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.11659192825112108,\n\
\ \"acc_stderr\": 0.02153963981624447,\n \"acc_norm\": 0.11659192825112108,\n\
\ \"acc_norm_stderr\": 0.02153963981624447\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2892561983471074,\n \"acc_stderr\": 0.041391127276354626,\n \"\
acc_norm\": 0.2892561983471074,\n \"acc_norm_stderr\": 0.041391127276354626\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.19642857142857142,\n\
\ \"acc_stderr\": 0.03770970049347019,\n \"acc_norm\": 0.19642857142857142,\n\
\ \"acc_norm_stderr\": 0.03770970049347019\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258972,\n\
\ \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258972\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.20085470085470086,\n\
\ \"acc_stderr\": 0.026246772946890477,\n \"acc_norm\": 0.20085470085470086,\n\
\ \"acc_norm_stderr\": 0.026246772946890477\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.20561941251596424,\n\
\ \"acc_stderr\": 0.014452500456785825,\n \"acc_norm\": 0.20561941251596424,\n\
\ \"acc_norm_stderr\": 0.014452500456785825\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.023532925431044294,\n\
\ \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.023532925431044294\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2681564245810056,\n\
\ \"acc_stderr\": 0.014816119635317003,\n \"acc_norm\": 0.2681564245810056,\n\
\ \"acc_norm_stderr\": 0.014816119635317003\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.0252616912197295,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.0252616912197295\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24115755627009647,\n\
\ \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.24115755627009647,\n\
\ \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023132376234543343,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023132376234543343\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590627,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590627\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23402868318122555,\n\
\ \"acc_stderr\": 0.010813585552659684,\n \"acc_norm\": 0.23402868318122555,\n\
\ \"acc_norm_stderr\": 0.010813585552659684\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.21568627450980393,\n \"acc_stderr\": 0.016639319350313264,\n \
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.016639319350313264\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.031362502409358936,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.031362502409358936\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2885572139303483,\n\
\ \"acc_stderr\": 0.032038410402133226,\n \"acc_norm\": 0.2885572139303483,\n\
\ \"acc_norm_stderr\": 0.032038410402133226\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21084337349397592,\n\
\ \"acc_stderr\": 0.031755547866299215,\n \"acc_norm\": 0.21084337349397592,\n\
\ \"acc_norm_stderr\": 0.031755547866299215\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.23976608187134502,\n \"acc_stderr\": 0.032744852119469564,\n\
\ \"acc_norm\": 0.23976608187134502,\n \"acc_norm_stderr\": 0.032744852119469564\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22399020807833536,\n\
\ \"mc1_stderr\": 0.014594964329474205,\n \"mc2\": 0.39509729118481285,\n\
\ \"mc2_stderr\": 0.014227368850578669\n }\n}\n```"
repo_url: https://huggingface.co/PY007/TinyLlama-1.1B-step-50K-105b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|arc:challenge|25_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hellaswag|10_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T12-30-04.204611.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T12-30-04.204611.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T12-30-04.204611.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T12-30-04.204611.parquet'
- config_name: results
data_files:
- split: 2023_09_12T12_30_04.204611
path:
- results_2023-09-12T12-30-04.204611.parquet
- split: latest
path:
- results_2023-09-12T12-30-04.204611.parquet
---
# Dataset Card for Evaluation run of PY007/TinyLlama-1.1B-step-50K-105b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PY007/TinyLlama-1.1B-step-50K-105b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PY007/TinyLlama-1.1B-step-50K-105b](https://huggingface.co/PY007/TinyLlama-1.1B-step-50K-105b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PY007__TinyLlama-1.1B-step-50K-105b",
"harness_truthfulqa_mc_0",
split="train")
```
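The per-task configurations listed in this card's metadata follow the same pattern. As a hedged sketch, the aggregated metrics of the run can be loaded through the "results" configuration and the "latest" split (both declared in the configs above):
```python
from datasets import load_dataset

# "results" aggregates all tasks of a run; the "latest" split always resolves
# to the most recent timestamped results file (see the configs listed above).
results = load_dataset(
    "open-llm-leaderboard/details_PY007__TinyLlama-1.1B-step-50K-105b",
    "results",
    split="latest",
)
print(results[0])  # a single row holding the run's aggregated metrics
```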
## Latest results
These are the [latest results from run 2023-09-12T12:30:04.204611](https://huggingface.co/datasets/open-llm-leaderboard/details_PY007__TinyLlama-1.1B-step-50K-105b/blob/main/results_2023-09-12T12-30-04.204611.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2687179047639338,
"acc_stderr": 0.03195135753112674,
"acc_norm": 0.27059968140846513,
"acc_norm_stderr": 0.03196141859030811,
"mc1": 0.22399020807833536,
"mc1_stderr": 0.014594964329474205,
"mc2": 0.39509729118481285,
"mc2_stderr": 0.014227368850578669
},
"harness|arc:challenge|25": {
"acc": 0.23464163822525597,
"acc_stderr": 0.012383873560768682,
"acc_norm": 0.25853242320819114,
"acc_norm_stderr": 0.012794553754288682
},
"harness|hellaswag|10": {
"acc": 0.35391356303525195,
"acc_stderr": 0.004772054904404421,
"acc_norm": 0.4410476000796654,
"acc_norm_stderr": 0.004954977202585471
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073462,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073462
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2565789473684211,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.2565789473684211,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2981132075471698,
"acc_stderr": 0.028152837942493857,
"acc_norm": 0.2981132075471698,
"acc_norm_stderr": 0.028152837942493857
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.22916666666666666,
"acc_stderr": 0.035146974678623884,
"acc_norm": 0.22916666666666666,
"acc_norm_stderr": 0.035146974678623884
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2832369942196532,
"acc_stderr": 0.03435568056047874,
"acc_norm": 0.2832369942196532,
"acc_norm_stderr": 0.03435568056047874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2170212765957447,
"acc_stderr": 0.02694748312149622,
"acc_norm": 0.2170212765957447,
"acc_norm_stderr": 0.02694748312149622
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.036951833116502325,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.036951833116502325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577656,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.02278967314577656
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047182,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047182
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25483870967741934,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.25483870967741934,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.03161856335358609,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.03161856335358609
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2,
"acc_stderr": 0.031234752377721175,
"acc_norm": 0.2,
"acc_norm_stderr": 0.031234752377721175
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.30303030303030304,
"acc_stderr": 0.032742879140268674,
"acc_norm": 0.30303030303030304,
"acc_norm_stderr": 0.032742879140268674
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.35751295336787564,
"acc_stderr": 0.03458816042181004,
"acc_norm": 0.35751295336787564,
"acc_norm_stderr": 0.03458816042181004
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36153846153846153,
"acc_stderr": 0.024359581465396987,
"acc_norm": 0.36153846153846153,
"acc_norm_stderr": 0.024359581465396987
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.02592887613276612,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.02592887613276612
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.18487394957983194,
"acc_stderr": 0.025215992877954202,
"acc_norm": 0.18487394957983194,
"acc_norm_stderr": 0.025215992877954202
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658754,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3486238532110092,
"acc_stderr": 0.020431254090714328,
"acc_norm": 0.3486238532110092,
"acc_norm_stderr": 0.020431254090714328
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25738396624472576,
"acc_stderr": 0.028458820991460295,
"acc_norm": 0.25738396624472576,
"acc_norm_stderr": 0.028458820991460295
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.11659192825112108,
"acc_stderr": 0.02153963981624447,
"acc_norm": 0.11659192825112108,
"acc_norm_stderr": 0.02153963981624447
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.21374045801526717,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.21374045801526717,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2892561983471074,
"acc_stderr": 0.041391127276354626,
"acc_norm": 0.2892561983471074,
"acc_norm_stderr": 0.041391127276354626
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26380368098159507,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.26380368098159507,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.19642857142857142,
"acc_stderr": 0.03770970049347019,
"acc_norm": 0.19642857142857142,
"acc_norm_stderr": 0.03770970049347019
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258972,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258972
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.20085470085470086,
"acc_stderr": 0.026246772946890477,
"acc_norm": 0.20085470085470086,
"acc_norm_stderr": 0.026246772946890477
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.20561941251596424,
"acc_stderr": 0.014452500456785825,
"acc_norm": 0.20561941251596424,
"acc_norm_stderr": 0.014452500456785825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.25722543352601157,
"acc_stderr": 0.023532925431044294,
"acc_norm": 0.25722543352601157,
"acc_norm_stderr": 0.023532925431044294
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2681564245810056,
"acc_stderr": 0.014816119635317003,
"acc_norm": 0.2681564245810056,
"acc_norm_stderr": 0.014816119635317003
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.0252616912197295,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.0252616912197295
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24115755627009647,
"acc_stderr": 0.024296594034763426,
"acc_norm": 0.24115755627009647,
"acc_norm_stderr": 0.024296594034763426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.023132376234543343,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.023132376234543343
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590627,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590627
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23402868318122555,
"acc_stderr": 0.010813585552659684,
"acc_norm": 0.23402868318122555,
"acc_norm_stderr": 0.010813585552659684
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.016639319350313264,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.016639319350313264
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.038950910157241364,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.038950910157241364
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.4,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2885572139303483,
"acc_stderr": 0.032038410402133226,
"acc_norm": 0.2885572139303483,
"acc_norm_stderr": 0.032038410402133226
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21084337349397592,
"acc_stderr": 0.031755547866299215,
"acc_norm": 0.21084337349397592,
"acc_norm_stderr": 0.031755547866299215
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.23976608187134502,
"acc_stderr": 0.032744852119469564,
"acc_norm": 0.23976608187134502,
"acc_norm_stderr": 0.032744852119469564
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22399020807833536,
"mc1_stderr": 0.014594964329474205,
"mc2": 0.39509729118481285,
"mc2_stderr": 0.014227368850578669
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Heitechsoft/goat-mpt | 2023-09-12T13:23:08.000Z | [
"region:us"
] | Heitechsoft | null | null | null | 0 | 0 | Entry not found |
lizhuang144/FACTUAL_Scene_Graph_ID | 2023-09-12T12:53:49.000Z | [
"region:us"
] | lizhuang144 | null | null | null | 0 | 0 | Please refer to https://github.com/zhuang-li/FACTUAL for a detailed description of this dataset. |
open-llm-leaderboard/details_Yehoon__yehoon_llama2 | 2023-09-12T12:53:31.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Yehoon/yehoon_llama2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Yehoon/yehoon_llama2](https://huggingface.co/Yehoon/yehoon_llama2) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yehoon__yehoon_llama2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T12:52:12.986563](https://huggingface.co/datasets/open-llm-leaderboard/details_Yehoon__yehoon_llama2/blob/main/results_2023-09-12T12-52-12.986563.json) (note\
\ that there might be results for other tasks in this repository if successive evals\
\ didn't cover the same tasks; you can find each of them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.51451624752162,\n\
\ \"acc_stderr\": 0.03496295048918739,\n \"acc_norm\": 0.5182177314274695,\n\
\ \"acc_norm_stderr\": 0.0349480152345826,\n \"mc1\": 0.3317013463892289,\n\
\ \"mc1_stderr\": 0.01648214881024147,\n \"mc2\": 0.491698763027883,\n\
\ \"mc2_stderr\": 0.015357177241665524\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5221843003412969,\n \"acc_stderr\": 0.014597001927076133,\n\
\ \"acc_norm\": 0.5477815699658704,\n \"acc_norm_stderr\": 0.014544519880633827\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.59699263095001,\n \
\ \"acc_stderr\": 0.004894997736719051,\n \"acc_norm\": 0.7897829117705636,\n\
\ \"acc_norm_stderr\": 0.004066299761478503\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.04017901275981748,\n\
\ \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.04017901275981748\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5584905660377358,\n \"acc_stderr\": 0.030561590426731837,\n\
\ \"acc_norm\": 0.5584905660377358,\n \"acc_norm_stderr\": 0.030561590426731837\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.45664739884393063,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.45664739884393063,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793275,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793275\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.044045561573747664,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.044045561573747664\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29894179894179895,\n \"acc_stderr\": 0.0235776047916558,\n \"\
acc_norm\": 0.29894179894179895,\n \"acc_norm_stderr\": 0.0235776047916558\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.042407993275749255,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.042407993275749255\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5516129032258065,\n \"acc_stderr\": 0.02829205683011273,\n \"\
acc_norm\": 0.5516129032258065,\n \"acc_norm_stderr\": 0.02829205683011273\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3645320197044335,\n \"acc_stderr\": 0.0338640574606209,\n \"acc_norm\"\
: 0.3645320197044335,\n \"acc_norm_stderr\": 0.0338640574606209\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6212121212121212,\n \"acc_stderr\": 0.03456088731993747,\n \"\
acc_norm\": 0.6212121212121212,\n \"acc_norm_stderr\": 0.03456088731993747\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7461139896373057,\n \"acc_stderr\": 0.0314102478056532,\n\
\ \"acc_norm\": 0.7461139896373057,\n \"acc_norm_stderr\": 0.0314102478056532\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.48205128205128206,\n \"acc_stderr\": 0.02533466708095495,\n\
\ \"acc_norm\": 0.48205128205128206,\n \"acc_norm_stderr\": 0.02533466708095495\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4831932773109244,\n \"acc_stderr\": 0.03246013680375308,\n \
\ \"acc_norm\": 0.4831932773109244,\n \"acc_norm_stderr\": 0.03246013680375308\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7045871559633028,\n \"acc_stderr\": 0.019560619182976,\n \"acc_norm\"\
: 0.7045871559633028,\n \"acc_norm_stderr\": 0.019560619182976\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4583333333333333,\n\
\ \"acc_stderr\": 0.033981108902946366,\n \"acc_norm\": 0.4583333333333333,\n\
\ \"acc_norm_stderr\": 0.033981108902946366\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.032282103870378935,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.032282103870378935\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7172995780590717,\n \"acc_stderr\": 0.029312814153955917,\n \
\ \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.029312814153955917\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5829596412556054,\n\
\ \"acc_stderr\": 0.03309266936071721,\n \"acc_norm\": 0.5829596412556054,\n\
\ \"acc_norm_stderr\": 0.03309266936071721\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870254,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870254\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6203703703703703,\n\
\ \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.6203703703703703,\n\
\ \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5705521472392638,\n \"acc_stderr\": 0.038890666191127236,\n\
\ \"acc_norm\": 0.5705521472392638,\n \"acc_norm_stderr\": 0.038890666191127236\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.027236013946196708,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.027236013946196708\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7088122605363985,\n\
\ \"acc_stderr\": 0.016246087069701407,\n \"acc_norm\": 0.7088122605363985,\n\
\ \"acc_norm_stderr\": 0.016246087069701407\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.026680134761679214,\n\
\ \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.026680134761679214\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5228758169934641,\n \"acc_stderr\": 0.028599936776089782,\n\
\ \"acc_norm\": 0.5228758169934641,\n \"acc_norm_stderr\": 0.028599936776089782\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5916398713826366,\n\
\ \"acc_stderr\": 0.027917050748484627,\n \"acc_norm\": 0.5916398713826366,\n\
\ \"acc_norm_stderr\": 0.027917050748484627\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.558641975308642,\n \"acc_stderr\": 0.027628737155668777,\n\
\ \"acc_norm\": 0.558641975308642,\n \"acc_norm_stderr\": 0.027628737155668777\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.375886524822695,\n \"acc_stderr\": 0.028893955412115882,\n \
\ \"acc_norm\": 0.375886524822695,\n \"acc_norm_stderr\": 0.028893955412115882\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38396349413298564,\n\
\ \"acc_stderr\": 0.01242158783313423,\n \"acc_norm\": 0.38396349413298564,\n\
\ \"acc_norm_stderr\": 0.01242158783313423\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4632352941176471,\n \"acc_stderr\": 0.030290619180485694,\n\
\ \"acc_norm\": 0.4632352941176471,\n \"acc_norm_stderr\": 0.030290619180485694\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5032679738562091,\n \"acc_stderr\": 0.020227402794434864,\n \
\ \"acc_norm\": 0.5032679738562091,\n \"acc_norm_stderr\": 0.020227402794434864\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5727272727272728,\n\
\ \"acc_stderr\": 0.047381987035454834,\n \"acc_norm\": 0.5727272727272728,\n\
\ \"acc_norm_stderr\": 0.047381987035454834\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5755102040816327,\n \"acc_stderr\": 0.031642094879429414,\n\
\ \"acc_norm\": 0.5755102040816327,\n \"acc_norm_stderr\": 0.031642094879429414\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7064676616915423,\n\
\ \"acc_stderr\": 0.032200241045342054,\n \"acc_norm\": 0.7064676616915423,\n\
\ \"acc_norm_stderr\": 0.032200241045342054\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03377310252209205,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03377310252209205\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3317013463892289,\n\
\ \"mc1_stderr\": 0.01648214881024147,\n \"mc2\": 0.491698763027883,\n\
\ \"mc2_stderr\": 0.015357177241665524\n }\n}\n```"
repo_url: https://huggingface.co/Yehoon/yehoon_llama2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|arc:challenge|25_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hellaswag|10_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T12-52-12.986563.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T12-52-12.986563.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T12-52-12.986563.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T12-52-12.986563.parquet'
- config_name: results
data_files:
- split: 2023_09_12T12_52_12.986563
path:
- results_2023-09-12T12-52-12.986563.parquet
- split: latest
path:
- results_2023-09-12T12-52-12.986563.parquet
---
# Dataset Card for Evaluation run of Yehoon/yehoon_llama2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Yehoon/yehoon_llama2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Yehoon/yehoon_llama2](https://huggingface.co/Yehoon/yehoon_llama2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Yehoon__yehoon_llama2",
"harness_truthfulqa_mc_0",
split="train")
```
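Each run can also be addressed directly through its timestamped split rather than "train"; a minimal sketch, using the split name listed in this card's configuration metadata:

```python
from datasets import load_dataset

# Per-task details for one specific run, selected by its timestamped split
# (taken from the card's YAML metadata); with a single run, this returns
# the same rows as the "latest" split.
data = load_dataset(
    "open-llm-leaderboard/details_Yehoon__yehoon_llama2",
    "harness_truthfulqa_mc_0",
    split="2023_09_12T12_52_12.986563",
)
```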
## Latest results
These are the [latest results from run 2023-09-12T12:52:12.986563](https://huggingface.co/datasets/open-llm-leaderboard/details_Yehoon__yehoon_llama2/blob/main/results_2023-09-12T12-52-12.986563.json) (note that there might be results for other tasks in this repository if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.51451624752162,
"acc_stderr": 0.03496295048918739,
"acc_norm": 0.5182177314274695,
"acc_norm_stderr": 0.0349480152345826,
"mc1": 0.3317013463892289,
"mc1_stderr": 0.01648214881024147,
"mc2": 0.491698763027883,
"mc2_stderr": 0.015357177241665524
},
"harness|arc:challenge|25": {
"acc": 0.5221843003412969,
"acc_stderr": 0.014597001927076133,
"acc_norm": 0.5477815699658704,
"acc_norm_stderr": 0.014544519880633827
},
"harness|hellaswag|10": {
"acc": 0.59699263095001,
"acc_stderr": 0.004894997736719051,
"acc_norm": 0.7897829117705636,
"acc_norm_stderr": 0.004066299761478503
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.04017901275981748,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.04017901275981748
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5584905660377358,
"acc_stderr": 0.030561590426731837,
"acc_norm": 0.5584905660377358,
"acc_norm_stderr": 0.030561590426731837
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.45664739884393063,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.45664739884393063,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.044405219061793275,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.044405219061793275
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.451063829787234,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.451063829787234,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.044045561573747664,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.044045561573747664
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29894179894179895,
"acc_stderr": 0.0235776047916558,
"acc_norm": 0.29894179894179895,
"acc_norm_stderr": 0.0235776047916558
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.042407993275749255,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.042407993275749255
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5516129032258065,
"acc_stderr": 0.02829205683011273,
"acc_norm": 0.5516129032258065,
"acc_norm_stderr": 0.02829205683011273
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.0338640574606209,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.0338640574606209
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6212121212121212,
"acc_stderr": 0.03456088731993747,
"acc_norm": 0.6212121212121212,
"acc_norm_stderr": 0.03456088731993747
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7461139896373057,
"acc_stderr": 0.0314102478056532,
"acc_norm": 0.7461139896373057,
"acc_norm_stderr": 0.0314102478056532
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.48205128205128206,
"acc_stderr": 0.02533466708095495,
"acc_norm": 0.48205128205128206,
"acc_norm_stderr": 0.02533466708095495
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.02620276653465215,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.02620276653465215
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4831932773109244,
"acc_stderr": 0.03246013680375308,
"acc_norm": 0.4831932773109244,
"acc_norm_stderr": 0.03246013680375308
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7045871559633028,
"acc_stderr": 0.019560619182976,
"acc_norm": 0.7045871559633028,
"acc_norm_stderr": 0.019560619182976
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.033981108902946366,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.033981108902946366
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.032282103870378935,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.032282103870378935
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.029312814153955917,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.029312814153955917
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5829596412556054,
"acc_stderr": 0.03309266936071721,
"acc_norm": 0.5829596412556054,
"acc_norm_stderr": 0.03309266936071721
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870254,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870254
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5705521472392638,
"acc_stderr": 0.038890666191127236,
"acc_norm": 0.5705521472392638,
"acc_norm_stderr": 0.038890666191127236
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.027236013946196708,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.027236013946196708
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7088122605363985,
"acc_stderr": 0.016246087069701407,
"acc_norm": 0.7088122605363985,
"acc_norm_stderr": 0.016246087069701407
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.026680134761679214,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.026680134761679214
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5228758169934641,
"acc_stderr": 0.028599936776089782,
"acc_norm": 0.5228758169934641,
"acc_norm_stderr": 0.028599936776089782
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5916398713826366,
"acc_stderr": 0.027917050748484627,
"acc_norm": 0.5916398713826366,
"acc_norm_stderr": 0.027917050748484627
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.558641975308642,
"acc_stderr": 0.027628737155668777,
"acc_norm": 0.558641975308642,
"acc_norm_stderr": 0.027628737155668777
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.375886524822695,
"acc_stderr": 0.028893955412115882,
"acc_norm": 0.375886524822695,
"acc_norm_stderr": 0.028893955412115882
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.38396349413298564,
"acc_stderr": 0.01242158783313423,
"acc_norm": 0.38396349413298564,
"acc_norm_stderr": 0.01242158783313423
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4632352941176471,
"acc_stderr": 0.030290619180485694,
"acc_norm": 0.4632352941176471,
"acc_norm_stderr": 0.030290619180485694
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5032679738562091,
"acc_stderr": 0.020227402794434864,
"acc_norm": 0.5032679738562091,
"acc_norm_stderr": 0.020227402794434864
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5727272727272728,
"acc_stderr": 0.047381987035454834,
"acc_norm": 0.5727272727272728,
"acc_norm_stderr": 0.047381987035454834
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5755102040816327,
"acc_stderr": 0.031642094879429414,
"acc_norm": 0.5755102040816327,
"acc_norm_stderr": 0.031642094879429414
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7064676616915423,
"acc_stderr": 0.032200241045342054,
"acc_norm": 0.7064676616915423,
"acc_norm_stderr": 0.032200241045342054
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03377310252209205,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03377310252209205
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3317013463892289,
"mc1_stderr": 0.01648214881024147,
"mc2": 0.491698763027883,
"mc2_stderr": 0.015357177241665524
}
}
```
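If you want the raw results file rather than the parsed splits, it can be fetched directly from the dataset repository. A minimal sketch using `huggingface_hub`; reading the `"all"` key assumes the file matches the dictionary reproduced above:
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file linked above from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Yehoon__yehoon_llama2",
    filename="results_2023-09-12T12-52-12.986563.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

# Aggregate metrics, assuming the layout shown in the block above.
print(results.get("all"))
```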
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_matsuo-lab__weblab-10b-instruction-sft | 2023-09-12T12:59:45.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 1 | 0 | ---
pretty_name: Evaluation run of matsuo-lab/weblab-10b-instruction-sft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [matsuo-lab/weblab-10b-instruction-sft](https://huggingface.co/matsuo-lab/weblab-10b-instruction-sft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_matsuo-lab__weblab-10b-instruction-sft\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T12:58:31.709829](https://huggingface.co/datasets/open-llm-leaderboard/details_matsuo-lab__weblab-10b-instruction-sft/blob/main/results_2023-09-12T12-58-31.709829.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2720990798000811,\n\
\ \"acc_stderr\": 0.03210772125152178,\n \"acc_norm\": 0.27545716880364796,\n\
\ \"acc_norm_stderr\": 0.03210749837969005,\n \"mc1\": 0.22643818849449204,\n\
\ \"mc1_stderr\": 0.014651337324602587,\n \"mc2\": 0.36793137737872583,\n\
\ \"mc2_stderr\": 0.01402606563925028\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.36860068259385664,\n \"acc_stderr\": 0.014097810678042192,\n\
\ \"acc_norm\": 0.40102389078498296,\n \"acc_norm_stderr\": 0.014322255790719864\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4872535351523601,\n\
\ \"acc_stderr\": 0.004988159744742522,\n \"acc_norm\": 0.652957578171679,\n\
\ \"acc_norm_stderr\": 0.004750565193992234\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.26666666666666666,\n\
\ \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.26666666666666666,\n\
\ \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.28289473684210525,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.28289473684210525,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.34,\n\
\ \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \
\ \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.30943396226415093,\n \"acc_stderr\": 0.028450154794118627,\n\
\ \"acc_norm\": 0.30943396226415093,\n \"acc_norm_stderr\": 0.028450154794118627\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n\
\ \"acc_stderr\": 0.03186209851641144,\n \"acc_norm\": 0.2254335260115607,\n\
\ \"acc_norm_stderr\": 0.03186209851641144\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179962,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179962\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3446808510638298,\n \"acc_stderr\": 0.031068985963122145,\n\
\ \"acc_norm\": 0.3446808510638298,\n \"acc_norm_stderr\": 0.031068985963122145\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n\
\ \"acc_stderr\": 0.0383515395439942,\n \"acc_norm\": 0.21052631578947367,\n\
\ \"acc_norm_stderr\": 0.0383515395439942\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.036001056927277716,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.036001056927277716\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24867724867724866,\n \"acc_stderr\": 0.022261817692400192,\n \"\
acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.022261817692400192\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047182,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047182\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22258064516129034,\n\
\ \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.22258064516129034,\n\
\ \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.15763546798029557,\n \"acc_stderr\": 0.025639014131172404,\n\
\ \"acc_norm\": 0.15763546798029557,\n \"acc_norm_stderr\": 0.025639014131172404\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3090909090909091,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.3090909090909091,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.02962022787479047,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02962022787479047\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.24870466321243523,\n \"acc_stderr\": 0.031195840877700304,\n\
\ \"acc_norm\": 0.24870466321243523,\n \"acc_norm_stderr\": 0.031195840877700304\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2153846153846154,\n \"acc_stderr\": 0.020843034557462878,\n\
\ \"acc_norm\": 0.2153846153846154,\n \"acc_norm_stderr\": 0.020843034557462878\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.25630252100840334,\n \"acc_stderr\": 0.02835962087053395,\n\
\ \"acc_norm\": 0.25630252100840334,\n \"acc_norm_stderr\": 0.02835962087053395\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.26972477064220185,\n \"acc_stderr\": 0.019028486711115452,\n \"\
acc_norm\": 0.26972477064220185,\n \"acc_norm_stderr\": 0.019028486711115452\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.20833333333333334,\n \"acc_stderr\": 0.02769691071309394,\n \"\
acc_norm\": 0.20833333333333334,\n \"acc_norm_stderr\": 0.02769691071309394\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604257,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604257\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.31645569620253167,\n \"acc_stderr\": 0.03027497488021898,\n \
\ \"acc_norm\": 0.31645569620253167,\n \"acc_norm_stderr\": 0.03027497488021898\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.336322869955157,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.336322869955157,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3305785123966942,\n \"acc_stderr\": 0.042943408452120954,\n \"\
acc_norm\": 0.3305785123966942,\n \"acc_norm_stderr\": 0.042943408452120954\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3425925925925926,\n\
\ \"acc_stderr\": 0.045879047413018084,\n \"acc_norm\": 0.3425925925925926,\n\
\ \"acc_norm_stderr\": 0.045879047413018084\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.294478527607362,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.294478527607362,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.22330097087378642,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.22330097087378642,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.37606837606837606,\n\
\ \"acc_stderr\": 0.03173393632969482,\n \"acc_norm\": 0.37606837606837606,\n\
\ \"acc_norm_stderr\": 0.03173393632969482\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28991060025542786,\n\
\ \"acc_stderr\": 0.016225017944770968,\n \"acc_norm\": 0.28991060025542786,\n\
\ \"acc_norm_stderr\": 0.016225017944770968\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2658959537572254,\n \"acc_stderr\": 0.02378620325550828,\n\
\ \"acc_norm\": 0.2658959537572254,\n \"acc_norm_stderr\": 0.02378620325550828\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25139664804469275,\n\
\ \"acc_stderr\": 0.014508979453553967,\n \"acc_norm\": 0.25139664804469275,\n\
\ \"acc_norm_stderr\": 0.014508979453553967\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.024739981355113592,\n\
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.024739981355113592\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2572347266881029,\n\
\ \"acc_stderr\": 0.024826171289250888,\n \"acc_norm\": 0.2572347266881029,\n\
\ \"acc_norm_stderr\": 0.024826171289250888\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.30864197530864196,\n \"acc_stderr\": 0.025702640260603753,\n\
\ \"acc_norm\": 0.30864197530864196,\n \"acc_norm_stderr\": 0.025702640260603753\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2801418439716312,\n \"acc_stderr\": 0.026789172351140242,\n \
\ \"acc_norm\": 0.2801418439716312,\n \"acc_norm_stderr\": 0.026789172351140242\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2757496740547588,\n\
\ \"acc_stderr\": 0.011413813609161005,\n \"acc_norm\": 0.2757496740547588,\n\
\ \"acc_norm_stderr\": 0.011413813609161005\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.19852941176470587,\n \"acc_stderr\": 0.02423101337054109,\n\
\ \"acc_norm\": 0.19852941176470587,\n \"acc_norm_stderr\": 0.02423101337054109\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.28431372549019607,\n \"acc_stderr\": 0.018249024411207657,\n \
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.018249024411207657\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3090909090909091,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.3090909090909091,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.27346938775510204,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.27346938775510204,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.25870646766169153,\n\
\ \"acc_stderr\": 0.030965903123573026,\n \"acc_norm\": 0.25870646766169153,\n\
\ \"acc_norm_stderr\": 0.030965903123573026\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.29518072289156627,\n\
\ \"acc_stderr\": 0.035509201856896294,\n \"acc_norm\": 0.29518072289156627,\n\
\ \"acc_norm_stderr\": 0.035509201856896294\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3391812865497076,\n \"acc_stderr\": 0.036310534964889056,\n\
\ \"acc_norm\": 0.3391812865497076,\n \"acc_norm_stderr\": 0.036310534964889056\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22643818849449204,\n\
\ \"mc1_stderr\": 0.014651337324602587,\n \"mc2\": 0.36793137737872583,\n\
\ \"mc2_stderr\": 0.01402606563925028\n }\n}\n```"
repo_url: https://huggingface.co/matsuo-lab/weblab-10b-instruction-sft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|arc:challenge|25_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hellaswag|10_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T12-58-31.709829.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T12-58-31.709829.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T12-58-31.709829.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T12-58-31.709829.parquet'
- config_name: results
data_files:
- split: 2023_09_12T12_58_31.709829
path:
- results_2023-09-12T12-58-31.709829.parquet
- split: latest
path:
- results_2023-09-12T12-58-31.709829.parquet
---
# Dataset Card for Evaluation run of matsuo-lab/weblab-10b-instruction-sft
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/matsuo-lab/weblab-10b-instruction-sft
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [matsuo-lab/weblab-10b-instruction-sft](https://huggingface.co/matsuo-lab/weblab-10b-instruction-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_matsuo-lab__weblab-10b-instruction-sft",
"harness_truthfulqa_mc_0",
split="train")
```
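The same repository also exposes the aggregated metrics and the individual timestamped runs as configurations and splits of their own. A minimal sketch (the config and split names below are copied from the YAML header above):
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_matsuo-lab__weblab-10b-instruction-sft"

# Aggregated metrics of the run: the "results" config, "latest" split.
results = load_dataset(repo, "results", split="latest")

# Details of a single task from a specific run, addressed by the
# timestamped split instead of "latest".
details = load_dataset(repo,
	"harness_hendrycksTest_world_religions_5",
	split="2023_09_12T12_58_31.709829")
```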
## Latest results
These are the [latest results from run 2023-09-12T12:58:31.709829](https://huggingface.co/datasets/open-llm-leaderboard/details_matsuo-lab__weblab-10b-instruction-sft/blob/main/results_2023-09-12T12-58-31.709829.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2720990798000811,
"acc_stderr": 0.03210772125152178,
"acc_norm": 0.27545716880364796,
"acc_norm_stderr": 0.03210749837969005,
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602587,
"mc2": 0.36793137737872583,
"mc2_stderr": 0.01402606563925028
},
"harness|arc:challenge|25": {
"acc": 0.36860068259385664,
"acc_stderr": 0.014097810678042192,
"acc_norm": 0.40102389078498296,
"acc_norm_stderr": 0.014322255790719864
},
"harness|hellaswag|10": {
"acc": 0.4872535351523601,
"acc_stderr": 0.004988159744742522,
"acc_norm": 0.652957578171679,
"acc_norm_stderr": 0.004750565193992234
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.038201699145179055,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.038201699145179055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.28289473684210525,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.28289473684210525,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.30943396226415093,
"acc_stderr": 0.028450154794118627,
"acc_norm": 0.30943396226415093,
"acc_norm_stderr": 0.028450154794118627
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641144,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641144
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179962,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179962
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3446808510638298,
"acc_stderr": 0.031068985963122145,
"acc_norm": 0.3446808510638298,
"acc_norm_stderr": 0.031068985963122145
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0383515395439942,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0383515395439942
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.036001056927277716,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.036001056927277716
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.022261817692400192,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.022261817692400192
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047182,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047182
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22258064516129034,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.22258064516129034,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15763546798029557,
"acc_stderr": 0.025639014131172404,
"acc_norm": 0.15763546798029557,
"acc_norm_stderr": 0.025639014131172404
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3090909090909091,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.3090909090909091,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02962022787479047,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02962022787479047
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24870466321243523,
"acc_stderr": 0.031195840877700304,
"acc_norm": 0.24870466321243523,
"acc_norm_stderr": 0.031195840877700304
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2153846153846154,
"acc_stderr": 0.020843034557462878,
"acc_norm": 0.2153846153846154,
"acc_norm_stderr": 0.020843034557462878
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.25630252100840334,
"acc_stderr": 0.02835962087053395,
"acc_norm": 0.25630252100840334,
"acc_norm_stderr": 0.02835962087053395
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26972477064220185,
"acc_stderr": 0.019028486711115452,
"acc_norm": 0.26972477064220185,
"acc_norm_stderr": 0.019028486711115452
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.02769691071309394,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.02769691071309394
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604257,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604257
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.31645569620253167,
"acc_stderr": 0.03027497488021898,
"acc_norm": 0.31645569620253167,
"acc_norm_stderr": 0.03027497488021898
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.336322869955157,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.336322869955157,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.21374045801526717,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.21374045801526717,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3305785123966942,
"acc_stderr": 0.042943408452120954,
"acc_norm": 0.3305785123966942,
"acc_norm_stderr": 0.042943408452120954
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3425925925925926,
"acc_stderr": 0.045879047413018084,
"acc_norm": 0.3425925925925926,
"acc_norm_stderr": 0.045879047413018084
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.294478527607362,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.294478527607362,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.22330097087378642,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.22330097087378642,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.37606837606837606,
"acc_stderr": 0.03173393632969482,
"acc_norm": 0.37606837606837606,
"acc_norm_stderr": 0.03173393632969482
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28991060025542786,
"acc_stderr": 0.016225017944770968,
"acc_norm": 0.28991060025542786,
"acc_norm_stderr": 0.016225017944770968
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.02378620325550828,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.02378620325550828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25139664804469275,
"acc_stderr": 0.014508979453553967,
"acc_norm": 0.25139664804469275,
"acc_norm_stderr": 0.014508979453553967
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.024739981355113592,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.024739981355113592
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2572347266881029,
"acc_stderr": 0.024826171289250888,
"acc_norm": 0.2572347266881029,
"acc_norm_stderr": 0.024826171289250888
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.30864197530864196,
"acc_stderr": 0.025702640260603753,
"acc_norm": 0.30864197530864196,
"acc_norm_stderr": 0.025702640260603753
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2801418439716312,
"acc_stderr": 0.026789172351140242,
"acc_norm": 0.2801418439716312,
"acc_norm_stderr": 0.026789172351140242
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2757496740547588,
"acc_stderr": 0.011413813609161005,
"acc_norm": 0.2757496740547588,
"acc_norm_stderr": 0.011413813609161005
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.19852941176470587,
"acc_stderr": 0.02423101337054109,
"acc_norm": 0.19852941176470587,
"acc_norm_stderr": 0.02423101337054109
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.018249024411207657,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.018249024411207657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3090909090909091,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.3090909090909091,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.27346938775510204,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.27346938775510204,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.25870646766169153,
"acc_stderr": 0.030965903123573026,
"acc_norm": 0.25870646766169153,
"acc_norm_stderr": 0.030965903123573026
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-virology|5": {
"acc": 0.29518072289156627,
"acc_stderr": 0.035509201856896294,
"acc_norm": 0.29518072289156627,
"acc_norm_stderr": 0.035509201856896294
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3391812865497076,
"acc_stderr": 0.036310534964889056,
"acc_norm": 0.3391812865497076,
"acc_norm_stderr": 0.036310534964889056
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602587,
"mc2": 0.36793137737872583,
"mc2_stderr": 0.01402606563925028
}
}
```
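As a quick illustration of working with these numbers, the per-task `hendrycksTest` entries can be macro-averaged into a single MMLU-style score (a minimal sketch, assuming the dict above has been parsed into a variable named `results`, e.g. with `json.load` on the results file):
```python
# `results` is assumed to hold the dict printed above.
mmlu = {task: scores for task, scores in results.items()
        if task.startswith("harness|hendrycksTest-")}
mean_acc = sum(scores["acc"] for scores in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subtasks, mean acc = {mean_acc:.4f}")
```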
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
denmulia/Lora_s | 2023-09-12T13:24:45.000Z | [
"region:us"
] | denmulia | null | null | null | 0 | 0 | Entry not found |
hope04302/plantVillageDataset | 2023-09-12T13:27:15.000Z | [
"license:unknown",
"region:us"
] | hope04302 | null | null | null | 0 | 0 | ---
license: unknown
---
|
Tsabing/wedding-card-sm | 2023-09-12T13:48:31.000Z | [
"region:us"
] | Tsabing | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_posicube__Llama2-chat-AYT-13B | 2023-09-13T08:34:14.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of posicube/Llama2-chat-AYT-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [posicube/Llama2-chat-AYT-13B](https://huggingface.co/posicube/Llama2-chat-AYT-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_posicube__Llama2-chat-AYT-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T13:56:43.141895](https://huggingface.co/datasets/open-llm-leaderboard/details_posicube__Llama2-chat-AYT-13B/blob/main/results_2023-09-12T13-56-43.141895.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5975021165823359,\n\
\ \"acc_stderr\": 0.03386126196308724,\n \"acc_norm\": 0.6014044143845657,\n\
\ \"acc_norm_stderr\": 0.03383887034254738,\n \"mc1\": 0.40269277845777235,\n\
\ \"mc1_stderr\": 0.017168830935187215,\n \"mc2\": 0.5579867609825017,\n\
\ \"mc2_stderr\": 0.015698640912348946\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6015358361774744,\n \"acc_stderr\": 0.014306946052735562,\n\
\ \"acc_norm\": 0.6331058020477816,\n \"acc_norm_stderr\": 0.014084133118104296\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6366261700856403,\n\
\ \"acc_stderr\": 0.004799882248494813,\n \"acc_norm\": 0.835291774546903,\n\
\ \"acc_norm_stderr\": 0.003701589571274314\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n\
\ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.04655010411319616,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.04655010411319616\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.032685726586674915,\n\
\ \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.032685726586674915\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.04514496132873634,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.04514496132873634\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3439153439153439,\n \"acc_stderr\": 0.024464426625596437,\n \"\
acc_norm\": 0.3439153439153439,\n \"acc_norm_stderr\": 0.024464426625596437\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.043062412591271526,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.043062412591271526\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n\
\ \"acc_stderr\": 0.026860206444724352,\n \"acc_norm\": 0.6645161290322581,\n\
\ \"acc_norm_stderr\": 0.026860206444724352\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812143\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.024697216930878948,\n\
\ \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.024697216930878948\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524586,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524586\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.03181110032413925,\n \
\ \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.03181110032413925\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7908256880733945,\n \"acc_stderr\": 0.01743793717334323,\n \"\
acc_norm\": 0.7908256880733945,\n \"acc_norm_stderr\": 0.01743793717334323\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044812,\n \"\
acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044812\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.02730348459906943,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.02730348459906943\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677698,\n\
\ \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677698\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908706,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908706\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.023636873317489294,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.023636873317489294\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7931034482758621,\n\
\ \"acc_stderr\": 0.014485656041669178,\n \"acc_norm\": 0.7931034482758621,\n\
\ \"acc_norm_stderr\": 0.014485656041669178\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6560693641618497,\n \"acc_stderr\": 0.025574123786546665,\n\
\ \"acc_norm\": 0.6560693641618497,\n \"acc_norm_stderr\": 0.025574123786546665\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4793296089385475,\n\
\ \"acc_stderr\": 0.016708205559996137,\n \"acc_norm\": 0.4793296089385475,\n\
\ \"acc_norm_stderr\": 0.016708205559996137\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.026992544339297236,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.026992544339297236\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.026385273703464492,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.026385273703464492\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900922,\n\
\ \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900922\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4576271186440678,\n\
\ \"acc_stderr\": 0.012724296550980188,\n \"acc_norm\": 0.4576271186440678,\n\
\ \"acc_norm_stderr\": 0.012724296550980188\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6066176470588235,\n \"acc_stderr\": 0.029674288281311155,\n\
\ \"acc_norm\": 0.6066176470588235,\n \"acc_norm_stderr\": 0.029674288281311155\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5898692810457516,\n \"acc_stderr\": 0.0198984127176359,\n \
\ \"acc_norm\": 0.5898692810457516,\n \"acc_norm_stderr\": 0.0198984127176359\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.673469387755102,\n \"acc_stderr\": 0.03002105623844031,\n\
\ \"acc_norm\": 0.673469387755102,\n \"acc_norm_stderr\": 0.03002105623844031\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n\
\ \"acc_stderr\": 0.030360490154014645,\n \"acc_norm\": 0.7562189054726368,\n\
\ \"acc_norm_stderr\": 0.030360490154014645\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"\
acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\"\
: 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\":\
\ {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40269277845777235,\n\
\ \"mc1_stderr\": 0.017168830935187215,\n \"mc2\": 0.5579867609825017,\n\
\ \"mc2_stderr\": 0.015698640912348946\n }\n}\n```"
repo_url: https://huggingface.co/posicube/Llama2-chat-AYT-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|arc:challenge|25_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hellaswag|10_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T13-56-43.141895.parquet'
- config_name: results
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- results_2023-09-12T13-56-43.141895.parquet
- split: latest
path:
- results_2023-09-12T13-56-43.141895.parquet
---
# Dataset Card for Evaluation run of posicube/Llama2-chat-AYT-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/posicube/Llama2-chat-AYT-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [posicube/Llama2-chat-AYT-13B](https://huggingface.co/posicube/Llama2-chat-AYT-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_posicube__Llama2-chat-AYT-13B",
"harness_truthfulqa_mc_0",
split="train")
```
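The `results` configuration and the `latest` split described above can be loaded the same way (a minimal sketch following the naming conventions listed in this card):
```python
from datasets import load_dataset

# "results" config and "latest" split follow the naming conventions of this card
results = load_dataset("open-llm-leaderboard/details_posicube__Llama2-chat-AYT-13B",
                       "results",
                       split="latest")
```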
## Latest results
These are the [latest results from run 2023-09-12T13:56:43.141895](https://huggingface.co/datasets/open-llm-leaderboard/details_posicube__Llama2-chat-AYT-13B/blob/main/results_2023-09-12T13-56-43.141895.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5975021165823359,
"acc_stderr": 0.03386126196308724,
"acc_norm": 0.6014044143845657,
"acc_norm_stderr": 0.03383887034254738,
"mc1": 0.40269277845777235,
"mc1_stderr": 0.017168830935187215,
"mc2": 0.5579867609825017,
"mc2_stderr": 0.015698640912348946
},
"harness|arc:challenge|25": {
"acc": 0.6015358361774744,
"acc_stderr": 0.014306946052735562,
"acc_norm": 0.6331058020477816,
"acc_norm_stderr": 0.014084133118104296
},
"harness|hellaswag|10": {
"acc": 0.6366261700856403,
"acc_stderr": 0.004799882248494813,
"acc_norm": 0.835291774546903,
"acc_norm_stderr": 0.003701589571274314
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.04655010411319616,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.04655010411319616
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.032685726586674915,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.032685726586674915
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.04514496132873634,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.04514496132873634
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3439153439153439,
"acc_stderr": 0.024464426625596437,
"acc_norm": 0.3439153439153439,
"acc_norm_stderr": 0.024464426625596437
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.043062412591271526,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.043062412591271526
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.026860206444724352,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.026860206444724352
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.024697216930878948,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.024697216930878948
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524586,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524586
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6008403361344538,
"acc_stderr": 0.03181110032413925,
"acc_norm": 0.6008403361344538,
"acc_norm_stderr": 0.03181110032413925
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7908256880733945,
"acc_stderr": 0.01743793717334323,
"acc_norm": 0.7908256880733945,
"acc_norm_stderr": 0.01743793717334323
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044812,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.02730348459906943,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.02730348459906943
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677698,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677698
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908706,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908706
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489294,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489294
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7931034482758621,
"acc_stderr": 0.014485656041669178,
"acc_norm": 0.7931034482758621,
"acc_norm_stderr": 0.014485656041669178
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6560693641618497,
"acc_stderr": 0.025574123786546665,
"acc_norm": 0.6560693641618497,
"acc_norm_stderr": 0.025574123786546665
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4793296089385475,
"acc_stderr": 0.016708205559996137,
"acc_norm": 0.4793296089385475,
"acc_norm_stderr": 0.016708205559996137
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.026992544339297236,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.026992544339297236
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464492,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464492
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.025329888171900922,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.025329888171900922
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4576271186440678,
"acc_stderr": 0.012724296550980188,
"acc_norm": 0.4576271186440678,
"acc_norm_stderr": 0.012724296550980188
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6066176470588235,
"acc_stderr": 0.029674288281311155,
"acc_norm": 0.6066176470588235,
"acc_norm_stderr": 0.029674288281311155
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5898692810457516,
"acc_stderr": 0.0198984127176359,
"acc_norm": 0.5898692810457516,
"acc_norm_stderr": 0.0198984127176359
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.673469387755102,
"acc_stderr": 0.03002105623844031,
"acc_norm": 0.673469387755102,
"acc_norm_stderr": 0.03002105623844031
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.030360490154014645,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.030360490154014645
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40269277845777235,
"mc1_stderr": 0.017168830935187215,
"mc2": 0.5579867609825017,
"mc2_stderr": 0.015698640912348946
}
}
```
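For quick inspection, here is a small sketch that extracts the headline metrics from the JSON above, assuming it has been saved locally as `results.json` (a hypothetical filename):
```python
import json

# Hypothetical local copy of the results JSON shown above
with open("results.json") as f:
    results = json.load(f)

overall = results["all"]
print(f"acc      = {overall['acc']:.4f} ± {overall['acc_stderr']:.4f}")
print(f"acc_norm = {overall['acc_norm']:.4f} ± {overall['acc_norm_stderr']:.4f}")
print(f"mc2      = {overall['mc2']:.4f} ± {overall['mc2_stderr']:.4f}")
```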
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
links-ads/hotspot-dataset | 2023-09-13T16:13:14.000Z | [
"arxiv:2308.02508",
"region:us"
] | links-ads | null | null | null | 0 | 0 | # Hotspot disambiguation dataset
This repository contains the dataset assembled as part of the work **A Multimodal Supervised Machine Learning Approach for Satellite-based Wildfire Identification in Europe**. The paper was presented at the International Geoscience and Remote Sensing Symposium (**IGARSS**) 2023.
The full paper is available at https://arxiv.org/abs/2308.02508.
The data folder contains two files:
- `dataset.csv`: this file contains the full cross-referenced dataset, obtained by conducting a temporal and spatial data intersection between the EFFIS burned areas and the MODIS/VIIRS hotspots.
- `dataset_500.csv`: this file contains a subset of the previous dataset (~500k data points), subsampled to obtain a dataset stratified with respect to the spatial distribution and with a positive-negative proportion of 10%-90%. In addition to the MODIS/VIIRS data points, extra columns have been added to improve model performance. This file is the one used to obtain the results shown in the paper (see the loading sketch below).
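A minimal loading sketch with pandas; the path follows the repository layout described above, and since the column set is not documented here, the snippet only inspects it:
```python
import pandas as pd

# Path assumed from the repository layout described above
df = pd.read_csv("data/dataset_500.csv")

# The column set is not documented here, so just inspect it
print(df.shape)
print(df.columns.tolist())
```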
## Code
The code and models used in this work are available at https://github.com/links-ads/hotspot-disambiguation.
## Contributions
- Angelica Urbanelli (angelica.urbanelli@linksfoundation.com)
- Luca Barco (luca.barco@linksfoundation.com)
- Edoardo Arnaudo (edoardo.arnaudo@polito.it | linksfoundation.com)
- Claudio Rossi (claudio.rossi@linksfoundation.com)
## BibTex
```
@inproceedings{urbanelli2023hotspot,
title={A Multimodal Supervised Machine Learning Approach for Satellite-based Wildfire Identification in Europe},
author={Urbanelli, Angelica and Barco, Luca and Arnaudo, Edoardo and Rossi, Claudio},
booktitle={2023 IEEE International Geoscience and Remote Sensing Symposium, IGARSS},
year={2023}
}
```
## Licence
cc-by-4.0
## Acknowledgments
This work was carried out in the context of two H2020 projects: SAFERS (GA n.869353) and OVERWATCH (GA n.101082320), and presented at IGARSS 2023.
|
deepHug/minigpt4_training_for_MMPretrain | 2023-09-13T07:48:26.000Z | [
"task_categories:text-retrieval",
"task_categories:conversational",
"size_categories:1K<n<10K",
"language:en",
"language:zh",
"license:cc-by-nc-4.0",
"region:us"
] | deepHug | null | null | null | 0 | 0 | ---
license: cc-by-nc-4.0
task_categories:
- text-retrieval
- conversational
language:
- en
- zh
size_categories:
- 1K<n<10K
---
Dataset for training MiniGPT4 from scratch in MMPretrain
---
More information and a guide can be found in the docs of [MMPretrain](https://mmpretrain.readthedocs.io/en/latest/). |
KevinRivera/globalyData | 2023-09-12T14:22:12.000Z | [
"region:us"
] | KevinRivera | null | null | null | 0 | 0 | Entry not found |
imthanhlv/laion2B-multi-Vietnamese-subset | 2023-09-12T19:51:20.000Z | [
"task_categories:text-to-image",
"task_categories:image-to-text",
"language:vi",
"license:cc-by-4.0",
"region:us"
] | imthanhlv | null | null | null | 0 | 0 | ---
license: cc-by-4.0
task_categories:
- text-to-image
- image-to-text
language:
- vi
---
# Dataset Card for LAION-2B-multi Vietnamese subset
### Dataset Summary
This dataset is the Vietnamese subset filtered from [Laion2B-multi](https://huggingface.co/datasets/laion/laion2B-multi); a minimal sketch of such a filter is shown below.
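The sketch streams the upstream dataset and keeps Vietnamese rows; the `LANGUAGE` and `TEXT` column names are assumptions based on the upstream schema:
```python
from datasets import load_dataset

# Stream LAION-2B-multi and keep only Vietnamese captions.
# "LANGUAGE" and "TEXT" column names are assumed from the upstream schema.
laion = load_dataset("laion/laion2B-multi", split="train", streaming=True)
vi_subset = laion.filter(lambda row: row["LANGUAGE"] == "vi")

for row in vi_subset.take(3):  # peek at a few matching rows
    print(row["TEXT"])
```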
To get the subset of your language, check out [this notebook](https://colab.research.google.com/drive/1bPvgFPKEIjzw7wT_9GwlDPvgTYDFdblr?usp=sharing) |
LBarroso22/trainGlobaly | 2023-09-12T14:31:42.000Z | [
"region:us"
] | LBarroso22 | null | null | null | 0 | 0 | Entry not found |
BAAI/Objaverse-MIX | 2023-10-11T00:59:42.000Z | [
"license:cc-by-4.0",
"region:us"
] | BAAI | null | null | null | 0 | 0 | ---
license: cc-by-4.0
---
This dataset contains two parts:
**occupancies**:
This part includes occupancies and point clouds. For specific usage, you can refer to [Occupancy Networks](https://github.com/autonomousvision/occupancy_networks).
**rendered_images**:
This part is a supplementary rendering dataset for the [Objaverse dataset](https://huggingface.co/datasets/allenai/objaverse). The rendering code is sourced from [zero123](https://github.com/cvlab-columbia/zero123), with two differences: the Eevee renderer is used, and the camera positions are fixed at 12 locations on a sphere of radius 2. The views are numbered from 12 to 23, corresponding to the following (see the geometry sketch after this list):
- 12: Front view
- 13: Side view (left)
- 14: Top view
- 15: Back view
- 16: Side view (right)
- 17: Bottom view
- 18-20: Three equidistant points on the polar angle of 45°
- 21-23: Three equidistant points on the polar angle of 135°
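As a worked example of the camera geometry above, a short sketch converting these spherical positions to Cartesian coordinates (the axis convention and the exact azimuths of the equidistant points are assumptions; the source only fixes the radius and polar angles):
```python
import numpy as np

def camera_position(polar_deg, azimuth_deg, radius=2.0):
    """Spherical-to-Cartesian conversion (z-up convention assumed)."""
    theta = np.deg2rad(polar_deg)  # polar angle measured from +z
    phi = np.deg2rad(azimuth_deg)  # azimuth in the xy-plane
    return radius * np.array([
        np.sin(theta) * np.cos(phi),
        np.sin(theta) * np.sin(phi),
        np.cos(theta),
    ])

# Views 18-20: three equidistant azimuths (values assumed) at a 45° polar angle
ring_45 = [camera_position(45, a) for a in (0, 120, 240)]
print(ring_45)
```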
|
malteee/TruckDet2 | 2023-09-12T14:59:02.000Z | [
"region:us"
] | malteee | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int64
- name: height
dtype: int64
- name: objects
struct:
- name: area
sequence: float64
- name: bbox
sequence:
sequence: float64
- name: category
sequence: int64
- name: id
sequence: int64
splits:
- name: train
num_bytes: 78780289.0
num_examples: 651
- name: test
num_bytes: 3798987.0
num_examples: 82
download_size: 82582528
dataset_size: 82579276.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "TruckDet2"
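Given the feature schema declared above, a minimal loading sketch:
```python
from datasets import load_dataset

ds = load_dataset("malteee/TruckDet2", split="train")
sample = ds[0]
print(sample["width"], sample["height"])  # image size fields from the schema
print(sample["objects"]["bbox"][:3])      # first few bounding boxes
```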
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Enno-Ai__vigogne2-enno-13b-sft-lora-4bit | 2023-09-12T14:55:08.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Enno-Ai/vigogne2-enno-13b-sft-lora-4bit
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Enno-Ai/vigogne2-enno-13b-sft-lora-4bit](https://huggingface.co/Enno-Ai/vigogne2-enno-13b-sft-lora-4bit)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Enno-Ai__vigogne2-enno-13b-sft-lora-4bit\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T14:53:48.356901](https://huggingface.co/datasets/open-llm-leaderboard/details_Enno-Ai__vigogne2-enno-13b-sft-lora-4bit/blob/main/results_2023-09-12T14-53-48.356901.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5431742017602187,\n\
\ \"acc_stderr\": 0.03475364882242359,\n \"acc_norm\": 0.5472939238594319,\n\
\ \"acc_norm_stderr\": 0.03473161659618756,\n \"mc1\": 0.28518971848225216,\n\
\ \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.42976038477184497,\n\
\ \"mc2_stderr\": 0.014287624194742454\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5784982935153583,\n \"acc_stderr\": 0.014430197069326021,\n\
\ \"acc_norm\": 0.6203071672354948,\n \"acc_norm_stderr\": 0.014182119866974872\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.625273849830711,\n\
\ \"acc_stderr\": 0.004830628620181032,\n \"acc_norm\": 0.8265285799641505,\n\
\ \"acc_norm_stderr\": 0.0037788044746059103\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.040601270352363966,\n\
\ \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.040601270352363966\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5584905660377358,\n \"acc_stderr\": 0.030561590426731837,\n\
\ \"acc_norm\": 0.5584905660377358,\n \"acc_norm_stderr\": 0.030561590426731837\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.041227287076512825,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.041227287076512825\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4508670520231214,\n\
\ \"acc_stderr\": 0.037940126746970296,\n \"acc_norm\": 0.4508670520231214,\n\
\ \"acc_norm_stderr\": 0.037940126746970296\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.041657747757287644,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.041657747757287644\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31746031746031744,\n \"acc_stderr\": 0.02397386199899207,\n \"\
acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.02397386199899207\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7,\n \"acc_stderr\": 0.026069362295335137,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.026069362295335137\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.03465304488406795,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.03465304488406795\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.696969696969697,\n \"acc_stderr\": 0.03274287914026867,\n \"acc_norm\"\
: 0.696969696969697,\n \"acc_norm_stderr\": 0.03274287914026867\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7616580310880829,\n \"acc_stderr\": 0.030748905363909878,\n\
\ \"acc_norm\": 0.7616580310880829,\n \"acc_norm_stderr\": 0.030748905363909878\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5128205128205128,\n \"acc_stderr\": 0.025342671293807257,\n\
\ \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.025342671293807257\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.032219436365661956,\n\
\ \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.032219436365661956\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7357798165137615,\n \"acc_stderr\": 0.018904164171510186,\n \"\
acc_norm\": 0.7357798165137615,\n \"acc_norm_stderr\": 0.018904164171510186\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.033922384053216174,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.033922384053216174\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7401960784313726,\n \"acc_stderr\": 0.030778554678693264,\n \"\
acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.030778554678693264\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.679324894514768,\n \"acc_stderr\": 0.030381931949990407,\n \
\ \"acc_norm\": 0.679324894514768,\n \"acc_norm_stderr\": 0.030381931949990407\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6098654708520179,\n\
\ \"acc_stderr\": 0.03273766725459157,\n \"acc_norm\": 0.6098654708520179,\n\
\ \"acc_norm_stderr\": 0.03273766725459157\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884123,\n \"\
acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884123\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6481481481481481,\n\
\ \"acc_stderr\": 0.04616631111801713,\n \"acc_norm\": 0.6481481481481481,\n\
\ \"acc_norm_stderr\": 0.04616631111801713\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.045218299028335844,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.045218299028335844\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7692307692307693,\n\
\ \"acc_stderr\": 0.027601921381417593,\n \"acc_norm\": 0.7692307692307693,\n\
\ \"acc_norm_stderr\": 0.027601921381417593\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7279693486590039,\n\
\ \"acc_stderr\": 0.015913367447500514,\n \"acc_norm\": 0.7279693486590039,\n\
\ \"acc_norm_stderr\": 0.015913367447500514\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5867052023121387,\n \"acc_stderr\": 0.02651126136940924,\n\
\ \"acc_norm\": 0.5867052023121387,\n \"acc_norm_stderr\": 0.02651126136940924\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33743016759776534,\n\
\ \"acc_stderr\": 0.015813901283913048,\n \"acc_norm\": 0.33743016759776534,\n\
\ \"acc_norm_stderr\": 0.015813901283913048\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5980392156862745,\n \"acc_stderr\": 0.028074158947600646,\n\
\ \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.028074158947600646\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6205787781350482,\n\
\ \"acc_stderr\": 0.02755994980234781,\n \"acc_norm\": 0.6205787781350482,\n\
\ \"acc_norm_stderr\": 0.02755994980234781\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.026822801759507884,\n\
\ \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.026822801759507884\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41134751773049644,\n \"acc_stderr\": 0.02935491115994098,\n \
\ \"acc_norm\": 0.41134751773049644,\n \"acc_norm_stderr\": 0.02935491115994098\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.423728813559322,\n\
\ \"acc_stderr\": 0.012620785155885994,\n \"acc_norm\": 0.423728813559322,\n\
\ \"acc_norm_stderr\": 0.012620785155885994\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03032024326500413,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03032024326500413\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5424836601307189,\n \"acc_stderr\": 0.020154685712590888,\n \
\ \"acc_norm\": 0.5424836601307189,\n \"acc_norm_stderr\": 0.020154685712590888\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n\
\ \"acc_stderr\": 0.04769300568972743,\n \"acc_norm\": 0.5454545454545454,\n\
\ \"acc_norm_stderr\": 0.04769300568972743\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6716417910447762,\n\
\ \"acc_stderr\": 0.033206858897443244,\n \"acc_norm\": 0.6716417910447762,\n\
\ \"acc_norm_stderr\": 0.033206858897443244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932264,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932264\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338734,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338734\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28518971848225216,\n\
\ \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.42976038477184497,\n\
\ \"mc2_stderr\": 0.014287624194742454\n }\n}\n```"
repo_url: https://huggingface.co/Enno-Ai/vigogne2-enno-13b-sft-lora-4bit
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|arc:challenge|25_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hellaswag|10_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T14-53-48.356901.parquet'
- config_name: results
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- results_2023-09-12T14-53-48.356901.parquet
- split: latest
path:
- results_2023-09-12T14-53-48.356901.parquet
---
# Dataset Card for Evaluation run of Enno-Ai/vigogne2-enno-13b-sft-lora-4bit
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Enno-Ai/vigogne2-enno-13b-sft-lora-4bit
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Enno-Ai/vigogne2-enno-13b-sft-lora-4bit](https://huggingface.co/Enno-Ai/vigogne2-enno-13b-sft-lora-4bit) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Enno-Ai__vigogne2-enno-13b-sft-lora-4bit",
"harness_truthfulqa_mc_0",
split="train")
```
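Similarly, to read the aggregated metrics rather than the per-sample details, you can point the same call at the "results" configuration. This is a minimal sketch using the "latest" split, which always mirrors the newest run:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics of the run;
# the "latest" split always points to the most recent results file.
results = load_dataset("open-llm-leaderboard/details_Enno-Ai__vigogne2-enno-13b-sft-lora-4bit",
	"results",
	split="latest")
print(results[0])
```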
## Latest results
These are the [latest results from run 2023-09-12T14:53:48.356901](https://huggingface.co/datasets/open-llm-leaderboard/details_Enno-Ai__vigogne2-enno-13b-sft-lora-4bit/blob/main/results_2023-09-12T14-53-48.356901.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5431742017602187,
"acc_stderr": 0.03475364882242359,
"acc_norm": 0.5472939238594319,
"acc_norm_stderr": 0.03473161659618756,
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454892,
"mc2": 0.42976038477184497,
"mc2_stderr": 0.014287624194742454
},
"harness|arc:challenge|25": {
"acc": 0.5784982935153583,
"acc_stderr": 0.014430197069326021,
"acc_norm": 0.6203071672354948,
"acc_norm_stderr": 0.014182119866974872
},
"harness|hellaswag|10": {
"acc": 0.625273849830711,
"acc_stderr": 0.004830628620181032,
"acc_norm": 0.8265285799641505,
"acc_norm_stderr": 0.0037788044746059103
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5328947368421053,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.5328947368421053,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5584905660377358,
"acc_stderr": 0.030561590426731837,
"acc_norm": 0.5584905660377358,
"acc_norm_stderr": 0.030561590426731837
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.041227287076512825,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.041227287076512825
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4508670520231214,
"acc_stderr": 0.037940126746970296,
"acc_norm": 0.4508670520231214,
"acc_norm_stderr": 0.037940126746970296
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.041657747757287644,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.041657747757287644
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.02397386199899207,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.02397386199899207
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7,
"acc_stderr": 0.026069362295335137,
"acc_norm": 0.7,
"acc_norm_stderr": 0.026069362295335137
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.03465304488406795,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.03465304488406795
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03274287914026867,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03274287914026867
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7616580310880829,
"acc_stderr": 0.030748905363909878,
"acc_norm": 0.7616580310880829,
"acc_norm_stderr": 0.030748905363909878
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5128205128205128,
"acc_stderr": 0.025342671293807257,
"acc_norm": 0.5128205128205128,
"acc_norm_stderr": 0.025342671293807257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5630252100840336,
"acc_stderr": 0.032219436365661956,
"acc_norm": 0.5630252100840336,
"acc_norm_stderr": 0.032219436365661956
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7357798165137615,
"acc_stderr": 0.018904164171510186,
"acc_norm": 0.7357798165137615,
"acc_norm_stderr": 0.018904164171510186
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.033922384053216174,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.033922384053216174
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.030778554678693264,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.030778554678693264
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.679324894514768,
"acc_stderr": 0.030381931949990407,
"acc_norm": 0.679324894514768,
"acc_norm_stderr": 0.030381931949990407
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6098654708520179,
"acc_stderr": 0.03273766725459157,
"acc_norm": 0.6098654708520179,
"acc_norm_stderr": 0.03273766725459157
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.04205953933884123,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.04205953933884123
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.04616631111801713,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.04616631111801713
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6380368098159509,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.6380368098159509,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.045218299028335844,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.045218299028335844
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7692307692307693,
"acc_stderr": 0.027601921381417593,
"acc_norm": 0.7692307692307693,
"acc_norm_stderr": 0.027601921381417593
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7279693486590039,
"acc_stderr": 0.015913367447500514,
"acc_norm": 0.7279693486590039,
"acc_norm_stderr": 0.015913367447500514
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5867052023121387,
"acc_stderr": 0.02651126136940924,
"acc_norm": 0.5867052023121387,
"acc_norm_stderr": 0.02651126136940924
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33743016759776534,
"acc_stderr": 0.015813901283913048,
"acc_norm": 0.33743016759776534,
"acc_norm_stderr": 0.015813901283913048
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.028074158947600646,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.028074158947600646
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6205787781350482,
"acc_stderr": 0.02755994980234781,
"acc_norm": 0.6205787781350482,
"acc_norm_stderr": 0.02755994980234781
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6327160493827161,
"acc_stderr": 0.026822801759507884,
"acc_norm": 0.6327160493827161,
"acc_norm_stderr": 0.026822801759507884
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41134751773049644,
"acc_stderr": 0.02935491115994098,
"acc_norm": 0.41134751773049644,
"acc_norm_stderr": 0.02935491115994098
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.423728813559322,
"acc_stderr": 0.012620785155885994,
"acc_norm": 0.423728813559322,
"acc_norm_stderr": 0.012620785155885994
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03032024326500413,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03032024326500413
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5424836601307189,
"acc_stderr": 0.020154685712590888,
"acc_norm": 0.5424836601307189,
"acc_norm_stderr": 0.020154685712590888
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04769300568972743,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04769300568972743
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6716417910447762,
"acc_stderr": 0.033206858897443244,
"acc_norm": 0.6716417910447762,
"acc_norm_stderr": 0.033206858897443244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932264,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932264
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338734,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338734
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454892,
"mc2": 0.42976038477184497,
"mc2_stderr": 0.014287624194742454
}
}
```
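You can also work with the raw results file directly. The sketch below downloads it from the Hub and averages the accuracies of the MMLU (hendrycksTest) subtasks; it assumes the file's contents match the dictionary shown above (some result files nest these metrics under a "results" key, which the code falls back to):
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file for this run from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Enno-Ai__vigogne2-enno-13b-sft-lora-4bit",
    filename="results_2023-09-12T14-53-48.356901.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# Some result files nest the metrics under a "results" key; otherwise use the top level.
metrics = data.get("results", data)
mmlu_accs = [v["acc"] for k, v in metrics.items() if k.startswith("harness|hendrycksTest")]
print(f"Mean MMLU accuracy over {len(mmlu_accs)} subtasks: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```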
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Cloroform/dataset | 2023-09-12T15:08:30.000Z | ["region:us"] | Cloroform | null | null | null | 0 | 0 | card |
mindchain/demo_011 | 2023-09-12T15:11:32.000Z | ["region:us"] | mindchain | null | null | null | 0 | 0 | Entry not found |
juselara1/mlds7_restaurants | 2023-09-12T15:18:12.000Z | ["region:us"] | juselara1 | null | null | null | 0 | 0 | Entry not found |
lalphass/midjourney_prompt_dataset_created_with_bard | 2023-09-12T15:29:23.000Z | ["license:apache-2.0", "region:us"] | lalphass | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
mindchain/demo_012 | 2023-09-12T15:30:22.000Z | ["region:us"] | mindchain | null | null | null | 0 | 0 | Entry not found |
babayo/babayo | 2023-09-12T15:37:07.000Z | ["license:bigcode-openrail-m", "region:us"] | babayo | null | null | null | 0 | 0 | ---
license: bigcode-openrail-m
---
|
amirrezadnt/neww | 2023-09-12T15:47:27.000Z | ["region:us"] | amirrezadnt | null | null | null | 0 | 0 | Entry not found |
mindchain/demo_13 | 2023-09-12T15:56:58.000Z | ["region:us"] | mindchain | null | null | null | 0 | 0 | Entry not found |
Xavier2213/Elviejomailo | 2023-09-12T16:04:27.000Z | ["region:us"] | Xavier2213 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_Fredithefish__Guanaco-13B-Uncensored | 2023-09-12T16:11:34.000Z | ["region:us"] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Fredithefish/Guanaco-13B-Uncensored
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Fredithefish/Guanaco-13B-Uncensored](https://huggingface.co/Fredithefish/Guanaco-13B-Uncensored)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Fredithefish__Guanaco-13B-Uncensored\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T16:10:16.997512](https://huggingface.co/datasets/open-llm-leaderboard/details_Fredithefish__Guanaco-13B-Uncensored/blob/main/results_2023-09-12T16-10-16.997512.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5384127689807373,\n\
\ \"acc_stderr\": 0.03451266823439898,\n \"acc_norm\": 0.5424705772302266,\n\
\ \"acc_norm_stderr\": 0.034491943523067406,\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.016002651487361005,\n \"mc2\": 0.4326200610504711,\n\
\ \"mc2_stderr\": 0.014586686652767588\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5622866894197952,\n \"acc_stderr\": 0.014497573881108287,\n\
\ \"acc_norm\": 0.5955631399317406,\n \"acc_norm_stderr\": 0.014342036483436175\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6208922525393348,\n\
\ \"acc_stderr\": 0.0048417344535066666,\n \"acc_norm\": 0.8270264887472615,\n\
\ \"acc_norm_stderr\": 0.003774513882615951\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5197368421052632,\n \"acc_stderr\": 0.04065771002562605,\n\
\ \"acc_norm\": 0.5197368421052632,\n \"acc_norm_stderr\": 0.04065771002562605\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5811320754716981,\n \"acc_stderr\": 0.03036505082911521,\n\
\ \"acc_norm\": 0.5811320754716981,\n \"acc_norm_stderr\": 0.03036505082911521\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5347222222222222,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.5347222222222222,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179962,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179962\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.03246956919789959,\n\
\ \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.03246956919789959\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192118,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192118\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.291005291005291,\n \"acc_stderr\": 0.023393826500484875,\n \"\
acc_norm\": 0.291005291005291,\n \"acc_norm_stderr\": 0.023393826500484875\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.04073524322147126,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.04073524322147126\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6096774193548387,\n \"acc_stderr\": 0.02775125663696958,\n \"\
acc_norm\": 0.6096774193548387,\n \"acc_norm_stderr\": 0.02775125663696958\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4482758620689655,\n \"acc_stderr\": 0.034991131376767445,\n \"\
acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.034991131376767445\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.037131580674819135,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.037131580674819135\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6818181818181818,\n \"acc_stderr\": 0.0331847733384533,\n \"acc_norm\"\
: 0.6818181818181818,\n \"acc_norm_stderr\": 0.0331847733384533\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.029519282616817234,\n\
\ \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.029519282616817234\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5051282051282051,\n \"acc_stderr\": 0.025349672906838653,\n\
\ \"acc_norm\": 0.5051282051282051,\n \"acc_norm_stderr\": 0.025349672906838653\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.026202766534652148,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.026202766534652148\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5462184873949579,\n \"acc_stderr\": 0.03233943468182087,\n \
\ \"acc_norm\": 0.5462184873949579,\n \"acc_norm_stderr\": 0.03233943468182087\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7155963302752294,\n \"acc_stderr\": 0.01934203658770259,\n \"\
acc_norm\": 0.7155963302752294,\n \"acc_norm_stderr\": 0.01934203658770259\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.03372343271653064,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.03372343271653064\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923403,\n \"\
acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923403\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7046413502109705,\n \"acc_stderr\": 0.029696338713422882,\n \
\ \"acc_norm\": 0.7046413502109705,\n \"acc_norm_stderr\": 0.029696338713422882\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.04328577215262971,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.04328577215262971\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884122,\n \"\
acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884122\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n\
\ \"acc_stderr\": 0.045245960070300476,\n \"acc_norm\": 0.6759259259259259,\n\
\ \"acc_norm_stderr\": 0.045245960070300476\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6196319018404908,\n \"acc_stderr\": 0.038142698932618374,\n\
\ \"acc_norm\": 0.6196319018404908,\n \"acc_norm_stderr\": 0.038142698932618374\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7435897435897436,\n\
\ \"acc_stderr\": 0.028605953702004278,\n \"acc_norm\": 0.7435897435897436,\n\
\ \"acc_norm_stderr\": 0.028605953702004278\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7266922094508301,\n\
\ \"acc_stderr\": 0.015936681062628556,\n \"acc_norm\": 0.7266922094508301,\n\
\ \"acc_norm_stderr\": 0.015936681062628556\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6213872832369942,\n \"acc_stderr\": 0.026113749361310345,\n\
\ \"acc_norm\": 0.6213872832369942,\n \"acc_norm_stderr\": 0.026113749361310345\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33743016759776534,\n\
\ \"acc_stderr\": 0.015813901283913048,\n \"acc_norm\": 0.33743016759776534,\n\
\ \"acc_norm_stderr\": 0.015813901283913048\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5980392156862745,\n \"acc_stderr\": 0.02807415894760065,\n\
\ \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.02807415894760065\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n\
\ \"acc_stderr\": 0.027648149599751464,\n \"acc_norm\": 0.6141479099678456,\n\
\ \"acc_norm_stderr\": 0.027648149599751464\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6172839506172839,\n \"acc_stderr\": 0.027044538138402602,\n\
\ \"acc_norm\": 0.6172839506172839,\n \"acc_norm_stderr\": 0.027044538138402602\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4067796610169492,\n\
\ \"acc_stderr\": 0.012546325596569534,\n \"acc_norm\": 0.4067796610169492,\n\
\ \"acc_norm_stderr\": 0.012546325596569534\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5477941176470589,\n \"acc_stderr\": 0.03023375855159644,\n\
\ \"acc_norm\": 0.5477941176470589,\n \"acc_norm_stderr\": 0.03023375855159644\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5392156862745098,\n \"acc_stderr\": 0.020165523313907904,\n \
\ \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.020165523313907904\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5877551020408164,\n \"acc_stderr\": 0.03151236044674268,\n\
\ \"acc_norm\": 0.5877551020408164,\n \"acc_norm_stderr\": 0.03151236044674268\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.681592039800995,\n\
\ \"acc_stderr\": 0.032941184790540944,\n \"acc_norm\": 0.681592039800995,\n\
\ \"acc_norm_stderr\": 0.032941184790540944\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.016002651487361005,\n \"mc2\": 0.4326200610504711,\n\
\ \"mc2_stderr\": 0.014586686652767588\n }\n}\n```"
repo_url: https://huggingface.co/Fredithefish/Guanaco-13B-Uncensored
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|arc:challenge|25_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hellaswag|10_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T16-10-16.997512.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T16-10-16.997512.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T16-10-16.997512.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T16-10-16.997512.parquet'
- config_name: results
data_files:
- split: 2023_09_12T16_10_16.997512
path:
- results_2023-09-12T16-10-16.997512.parquet
- split: latest
path:
- results_2023-09-12T16-10-16.997512.parquet
---
# Dataset Card for Evaluation run of Fredithefish/Guanaco-13B-Uncensored
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Fredithefish/Guanaco-13B-Uncensored
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Fredithefish/Guanaco-13B-Uncensored](https://huggingface.co/Fredithefish/Guanaco-13B-Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Fredithefish__Guanaco-13B-Uncensored",
"harness_truthfulqa_mc_0",
split="train")
```
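Other configurations follow the same pattern. As a minimal sketch using only the config and split names declared in the header above (not an officially documented workflow), you can also fetch the aggregated metrics via the "results" configuration, or pin a single task to one run through its timestamped split:
```python
from datasets import load_dataset

# Aggregated metrics; the "latest" split always tracks the most recent upload.
results = load_dataset("open-llm-leaderboard/details_Fredithefish__Guanaco-13B-Uncensored",
	"results",
	split="latest")

# One task, pinned to a specific run via the timestamped split name from the header.
professional_law = load_dataset("open-llm-leaderboard/details_Fredithefish__Guanaco-13B-Uncensored",
	"harness_hendrycksTest_professional_law_5",
	split="2023_09_12T16_10_16.997512")
```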
## Latest results
These are the [latest results from run 2023-09-12T16:10:16.997512](https://huggingface.co/datasets/open-llm-leaderboard/details_Fredithefish__Guanaco-13B-Uncensored/blob/main/results_2023-09-12T16-10-16.997512.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5384127689807373,
"acc_stderr": 0.03451266823439898,
"acc_norm": 0.5424705772302266,
"acc_norm_stderr": 0.034491943523067406,
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361005,
"mc2": 0.4326200610504711,
"mc2_stderr": 0.014586686652767588
},
"harness|arc:challenge|25": {
"acc": 0.5622866894197952,
"acc_stderr": 0.014497573881108287,
"acc_norm": 0.5955631399317406,
"acc_norm_stderr": 0.014342036483436175
},
"harness|hellaswag|10": {
"acc": 0.6208922525393348,
"acc_stderr": 0.0048417344535066666,
"acc_norm": 0.8270264887472615,
"acc_norm_stderr": 0.003774513882615951
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5197368421052632,
"acc_stderr": 0.04065771002562605,
"acc_norm": 0.5197368421052632,
"acc_norm_stderr": 0.04065771002562605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5811320754716981,
"acc_stderr": 0.03036505082911521,
"acc_norm": 0.5811320754716981,
"acc_norm_stderr": 0.03036505082911521
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5347222222222222,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.5347222222222222,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179962,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179962
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.03246956919789959,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.03246956919789959
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192118,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192118
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.291005291005291,
"acc_stderr": 0.023393826500484875,
"acc_norm": 0.291005291005291,
"acc_norm_stderr": 0.023393826500484875
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147126,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147126
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6096774193548387,
"acc_stderr": 0.02775125663696958,
"acc_norm": 0.6096774193548387,
"acc_norm_stderr": 0.02775125663696958
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.034991131376767445,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.034991131376767445
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.037131580674819135,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.037131580674819135
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.0331847733384533,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.0331847733384533
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.029519282616817234,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.029519282616817234
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5051282051282051,
"acc_stderr": 0.025349672906838653,
"acc_norm": 0.5051282051282051,
"acc_norm_stderr": 0.025349672906838653
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.026202766534652148,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.026202766534652148
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5462184873949579,
"acc_stderr": 0.03233943468182087,
"acc_norm": 0.5462184873949579,
"acc_norm_stderr": 0.03233943468182087
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7155963302752294,
"acc_stderr": 0.01934203658770259,
"acc_norm": 0.7155963302752294,
"acc_norm_stderr": 0.01934203658770259
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.03372343271653064,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.03372343271653064
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.030964517926923403,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.030964517926923403
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7046413502109705,
"acc_stderr": 0.029696338713422882,
"acc_norm": 0.7046413502109705,
"acc_norm_stderr": 0.029696338713422882
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.04328577215262971,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.04328577215262971
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.04205953933884122,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.04205953933884122
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.045245960070300476,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.045245960070300476
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6196319018404908,
"acc_stderr": 0.038142698932618374,
"acc_norm": 0.6196319018404908,
"acc_norm_stderr": 0.038142698932618374
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340456,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340456
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7435897435897436,
"acc_stderr": 0.028605953702004278,
"acc_norm": 0.7435897435897436,
"acc_norm_stderr": 0.028605953702004278
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7266922094508301,
"acc_stderr": 0.015936681062628556,
"acc_norm": 0.7266922094508301,
"acc_norm_stderr": 0.015936681062628556
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6213872832369942,
"acc_stderr": 0.026113749361310345,
"acc_norm": 0.6213872832369942,
"acc_norm_stderr": 0.026113749361310345
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33743016759776534,
"acc_stderr": 0.015813901283913048,
"acc_norm": 0.33743016759776534,
"acc_norm_stderr": 0.015813901283913048
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.02807415894760065,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.02807415894760065
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6141479099678456,
"acc_stderr": 0.027648149599751464,
"acc_norm": 0.6141479099678456,
"acc_norm_stderr": 0.027648149599751464
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6172839506172839,
"acc_stderr": 0.027044538138402602,
"acc_norm": 0.6172839506172839,
"acc_norm_stderr": 0.027044538138402602
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4067796610169492,
"acc_stderr": 0.012546325596569534,
"acc_norm": 0.4067796610169492,
"acc_norm_stderr": 0.012546325596569534
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5477941176470589,
"acc_stderr": 0.03023375855159644,
"acc_norm": 0.5477941176470589,
"acc_norm_stderr": 0.03023375855159644
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.020165523313907904,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.020165523313907904
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5877551020408164,
"acc_stderr": 0.03151236044674268,
"acc_norm": 0.5877551020408164,
"acc_norm_stderr": 0.03151236044674268
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.681592039800995,
"acc_stderr": 0.032941184790540944,
"acc_norm": 0.681592039800995,
"acc_norm_stderr": 0.032941184790540944
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361005,
"mc2": 0.4326200610504711,
"mc2_stderr": 0.014586686652767588
}
}
```
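Each per-task block above reports the same fields, so the dict is easy to post-process once loaded into Python. Below is a minimal sketch that reproduces only a few entries from the snapshot above; treating the "all" block as a plain unweighted mean of the per-task accuracies is an assumption about how it is derived, not something the card states:
```python
from statistics import mean

# A few entries copied from the results snapshot above; the full dict
# holds one entry per evaluated task plus the "all" summary.
results = {
    "all": {"acc": 0.5384127689807373},
    "harness|arc:challenge|25": {"acc": 0.5622866894197952},
    "harness|hellaswag|10": {"acc": 0.6208922525393348},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.7602339181286549},
    "harness|truthfulqa:mc|0": {"mc1": 0.2974296205630355},  # no "acc" field
}

# Average the accuracy over every task that reports one (TruthfulQA only has mc1/mc2).
# With the full dict this should approximate the "all" figure above (an assumption).
task_accs = [v["acc"] for key, v in results.items() if key != "all" and "acc" in v]
print(round(mean(task_accs), 4))
```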
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
shijli/glge | 2023-09-15T06:22:56.000Z | [
"region:us"
] | shijli | null | null | null | 1 | 0 | # A New General Language Generation Evaluation Benchmark
The original dataset and instructions can be found [here](https://github.com/microsoft/glge).
To create a dataset with a specific version, you can simply run:
```commandline
git clone https://huggingface.co/datasets/shijli/glge
cd glge/data
bash preprocess.sh dataset version model
```
by replacing `dataset`, `version`, and `model` with your own parameters.
For example,
```commandline
bash preprocess.sh cnndm easy prophetnet_en
```
will create the easy version of the cnndm data using the prophetnet_en vocabulary file. Please rename and move the
vocabulary file into the `./vocab` directory.
|
open-llm-leaderboard/details_zarakiquemparte__zararp-l2-7b | 2023-09-12T16:18:58.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of zarakiquemparte/zararp-l2-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [zarakiquemparte/zararp-l2-7b](https://huggingface.co/zarakiquemparte/zararp-l2-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_zarakiquemparte__zararp-l2-7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T16:17:38.915453](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__zararp-l2-7b/blob/main/results_2023-09-12T16-17-38.915453.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5155499163975856,\n\
\ \"acc_stderr\": 0.0350436566389458,\n \"acc_norm\": 0.5191361298261467,\n\
\ \"acc_norm_stderr\": 0.03502838898542592,\n \"mc1\": 0.34394124847001223,\n\
\ \"mc1_stderr\": 0.016629087514276775,\n \"mc2\": 0.512599577390538,\n\
\ \"mc2_stderr\": 0.015390731196832515\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5349829351535836,\n \"acc_stderr\": 0.01457558392201967,\n\
\ \"acc_norm\": 0.5631399317406144,\n \"acc_norm_stderr\": 0.01449442158425652\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6084445329615614,\n\
\ \"acc_stderr\": 0.00487100593940747,\n \"acc_norm\": 0.7918741286596296,\n\
\ \"acc_norm_stderr\": 0.004051376719498002\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4144736842105263,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.4144736842105263,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.569811320754717,\n \"acc_stderr\": 0.030471445867183238,\n\
\ \"acc_norm\": 0.569811320754717,\n \"acc_norm_stderr\": 0.030471445867183238\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.041553199555931467,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.041553199555931467\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4277456647398844,\n\
\ \"acc_stderr\": 0.037724468575180255,\n \"acc_norm\": 0.4277456647398844,\n\
\ \"acc_norm_stderr\": 0.037724468575180255\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467383,\n\
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467383\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2857142857142857,\n \"acc_stderr\": 0.023266512213730575,\n \"\
acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.023266512213730575\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235173,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5612903225806452,\n\
\ \"acc_stderr\": 0.028229497320317213,\n \"acc_norm\": 0.5612903225806452,\n\
\ \"acc_norm_stderr\": 0.028229497320317213\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3793103448275862,\n \"acc_stderr\": 0.03413963805906235,\n\
\ \"acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.03413963805906235\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.03756335775187897,\n\
\ \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03756335775187897\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6565656565656566,\n \"acc_stderr\": 0.03383201223244442,\n \"\
acc_norm\": 0.6565656565656566,\n \"acc_norm_stderr\": 0.03383201223244442\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7564766839378239,\n \"acc_stderr\": 0.03097543638684544,\n\
\ \"acc_norm\": 0.7564766839378239,\n \"acc_norm_stderr\": 0.03097543638684544\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4897435897435897,\n \"acc_stderr\": 0.025345672221942374,\n\
\ \"acc_norm\": 0.4897435897435897,\n \"acc_norm_stderr\": 0.025345672221942374\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.48739495798319327,\n \"acc_stderr\": 0.032468167657521745,\n\
\ \"acc_norm\": 0.48739495798319327,\n \"acc_norm_stderr\": 0.032468167657521745\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.708256880733945,\n \"acc_stderr\": 0.019489300968876515,\n \"\
acc_norm\": 0.708256880733945,\n \"acc_norm_stderr\": 0.019489300968876515\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7156862745098039,\n \"acc_stderr\": 0.03166009679399813,\n \"\
acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.03166009679399813\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7257383966244726,\n \"acc_stderr\": 0.029041333510598035,\n \
\ \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.029041333510598035\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\"\
: 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.04766075165356462,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.04766075165356462\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5521472392638037,\n \"acc_stderr\": 0.03906947479456607,\n\
\ \"acc_norm\": 0.5521472392638037,\n \"acc_norm_stderr\": 0.03906947479456607\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572924,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572924\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7606837606837606,\n\
\ \"acc_stderr\": 0.027951826808924333,\n \"acc_norm\": 0.7606837606837606,\n\
\ \"acc_norm_stderr\": 0.027951826808924333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7088122605363985,\n\
\ \"acc_stderr\": 0.016246087069701407,\n \"acc_norm\": 0.7088122605363985,\n\
\ \"acc_norm_stderr\": 0.016246087069701407\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5433526011560693,\n \"acc_stderr\": 0.02681771813034892,\n\
\ \"acc_norm\": 0.5433526011560693,\n \"acc_norm_stderr\": 0.02681771813034892\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2849162011173184,\n\
\ \"acc_stderr\": 0.015096222302469797,\n \"acc_norm\": 0.2849162011173184,\n\
\ \"acc_norm_stderr\": 0.015096222302469797\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.028491993586171566,\n\
\ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.028491993586171566\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6045016077170418,\n\
\ \"acc_stderr\": 0.027770918531427838,\n \"acc_norm\": 0.6045016077170418,\n\
\ \"acc_norm_stderr\": 0.027770918531427838\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5401234567901234,\n \"acc_stderr\": 0.02773102275353928,\n\
\ \"acc_norm\": 0.5401234567901234,\n \"acc_norm_stderr\": 0.02773102275353928\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3829787234042553,\n \"acc_stderr\": 0.02899908090480618,\n \
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.02899908090480618\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35919165580182527,\n\
\ \"acc_stderr\": 0.012253386187584248,\n \"acc_norm\": 0.35919165580182527,\n\
\ \"acc_norm_stderr\": 0.012253386187584248\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n\
\ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5049019607843137,\n \"acc_stderr\": 0.02022686271003946,\n \
\ \"acc_norm\": 0.5049019607843137,\n \"acc_norm_stderr\": 0.02022686271003946\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.031251275910891656,\n\
\ \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.031251275910891656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03333333333333335,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03333333333333335\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34394124847001223,\n\
\ \"mc1_stderr\": 0.016629087514276775,\n \"mc2\": 0.512599577390538,\n\
\ \"mc2_stderr\": 0.015390731196832515\n }\n}\n```"
repo_url: https://huggingface.co/zarakiquemparte/zararp-l2-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|arc:challenge|25_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hellaswag|10_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T16-17-38.915453.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T16-17-38.915453.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T16-17-38.915453.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T16-17-38.915453.parquet'
- config_name: results
data_files:
- split: 2023_09_12T16_17_38.915453
path:
- results_2023-09-12T16-17-38.915453.parquet
- split: latest
path:
- results_2023-09-12T16-17-38.915453.parquet
---
# Dataset Card for Evaluation run of zarakiquemparte/zararp-l2-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/zarakiquemparte/zararp-l2-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [zarakiquemparte/zararp-l2-7b](https://huggingface.co/zarakiquemparte/zararp-l2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_zarakiquemparte__zararp-l2-7b",
"harness_truthfulqa_mc_0",
split="train")
```
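The same pattern works for any of the configurations declared in the YAML section above. As a minimal sketch (assuming the configuration names and split labels listed there), you can also pull the aggregated scores, or address a single task at a specific run via its timestamped split:
```python
from datasets import load_dataset

# Aggregated scores for the run: the "results" configuration, "latest" split.
results = load_dataset("open-llm-leaderboard/details_zarakiquemparte__zararp-l2-7b",
                       "results",
                       split="latest")

# Details for one task at a specific run, addressed by its timestamped split.
computer_security = load_dataset("open-llm-leaderboard/details_zarakiquemparte__zararp-l2-7b",
                                 "harness_hendrycksTest_computer_security_5",
                                 split="2023_09_12T16_17_38.915453")
```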
## Latest results
These are the [latest results from run 2023-09-12T16:17:38.915453](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__zararp-l2-7b/blob/main/results_2023-09-12T16-17-38.915453.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5155499163975856,
"acc_stderr": 0.0350436566389458,
"acc_norm": 0.5191361298261467,
"acc_norm_stderr": 0.03502838898542592,
"mc1": 0.34394124847001223,
"mc1_stderr": 0.016629087514276775,
"mc2": 0.512599577390538,
"mc2_stderr": 0.015390731196832515
},
"harness|arc:challenge|25": {
"acc": 0.5349829351535836,
"acc_stderr": 0.01457558392201967,
"acc_norm": 0.5631399317406144,
"acc_norm_stderr": 0.01449442158425652
},
"harness|hellaswag|10": {
"acc": 0.6084445329615614,
"acc_stderr": 0.00487100593940747,
"acc_norm": 0.7918741286596296,
"acc_norm_stderr": 0.004051376719498002
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4144736842105263,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.4144736842105263,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.569811320754717,
"acc_stderr": 0.030471445867183238,
"acc_norm": 0.569811320754717,
"acc_norm_stderr": 0.030471445867183238
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.041553199555931467,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.041553199555931467
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.037724468575180255,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.037724468575180255
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.03261936918467383,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.03261936918467383
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.023266512213730575,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.023266512213730575
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235173,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5612903225806452,
"acc_stderr": 0.028229497320317213,
"acc_norm": 0.5612903225806452,
"acc_norm_stderr": 0.028229497320317213
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3793103448275862,
"acc_stderr": 0.03413963805906235,
"acc_norm": 0.3793103448275862,
"acc_norm_stderr": 0.03413963805906235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03756335775187897,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03756335775187897
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6565656565656566,
"acc_stderr": 0.03383201223244442,
"acc_norm": 0.6565656565656566,
"acc_norm_stderr": 0.03383201223244442
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7564766839378239,
"acc_stderr": 0.03097543638684544,
"acc_norm": 0.7564766839378239,
"acc_norm_stderr": 0.03097543638684544
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4897435897435897,
"acc_stderr": 0.025345672221942374,
"acc_norm": 0.4897435897435897,
"acc_norm_stderr": 0.025345672221942374
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.48739495798319327,
"acc_stderr": 0.032468167657521745,
"acc_norm": 0.48739495798319327,
"acc_norm_stderr": 0.032468167657521745
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.708256880733945,
"acc_stderr": 0.019489300968876515,
"acc_norm": 0.708256880733945,
"acc_norm_stderr": 0.019489300968876515
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.03166009679399813,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.03166009679399813
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.029041333510598035,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.029041333510598035
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04766075165356462,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04766075165356462
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5521472392638037,
"acc_stderr": 0.03906947479456607,
"acc_norm": 0.5521472392638037,
"acc_norm_stderr": 0.03906947479456607
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.04498676320572924,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.04498676320572924
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7606837606837606,
"acc_stderr": 0.027951826808924333,
"acc_norm": 0.7606837606837606,
"acc_norm_stderr": 0.027951826808924333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7088122605363985,
"acc_stderr": 0.016246087069701407,
"acc_norm": 0.7088122605363985,
"acc_norm_stderr": 0.016246087069701407
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.02681771813034892,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.02681771813034892
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2849162011173184,
"acc_stderr": 0.015096222302469797,
"acc_norm": 0.2849162011173184,
"acc_norm_stderr": 0.015096222302469797
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.028491993586171566,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.028491993586171566
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6045016077170418,
"acc_stderr": 0.027770918531427838,
"acc_norm": 0.6045016077170418,
"acc_norm_stderr": 0.027770918531427838
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5401234567901234,
"acc_stderr": 0.02773102275353928,
"acc_norm": 0.5401234567901234,
"acc_norm_stderr": 0.02773102275353928
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.02899908090480618,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.02899908090480618
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35919165580182527,
"acc_stderr": 0.012253386187584248,
"acc_norm": 0.35919165580182527,
"acc_norm_stderr": 0.012253386187584248
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5049019607843137,
"acc_stderr": 0.02022686271003946,
"acc_norm": 0.5049019607843137,
"acc_norm_stderr": 0.02022686271003946
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6081632653061224,
"acc_stderr": 0.031251275910891656,
"acc_norm": 0.6081632653061224,
"acc_norm_stderr": 0.031251275910891656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03333333333333335,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03333333333333335
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34394124847001223,
"mc1_stderr": 0.016629087514276775,
"mc2": 0.512599577390538,
"mc2_stderr": 0.015390731196832515
}
}
```
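As a quick sanity check, the per-task MMLU scores in this dictionary can be averaged directly, since every MMLU entry shares the `harness|hendrycksTest-` key prefix. A minimal, runnable sketch (only two of the entries above are reproduced here for brevity; `results` stands in for the full dict):
```python
# `results` stands in for the full dictionary printed above; only two MMLU
# entries are reproduced here, with values copied from the results block.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.33},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5111111111111111},
}

# Average accuracy over the MMLU ("hendrycksTest") sub-tasks.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest")]
print(f"MMLU average acc over {len(mmlu_accs)} sub-tasks: "
      f"{sum(mmlu_accs) / len(mmlu_accs):.4f}")
```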
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
agemagician/doi-dataset-test | 2023-09-12T16:27:59.000Z | [
"license:cc-by-nc-sa-4.0",
"doi:10.57967/hf/1096",
"region:us"
] | agemagician | null | null | null | 0 | 0 | ---
license: cc-by-nc-sa-4.0
---
|
open-llm-leaderboard/details_lu-vae__llama2-13b-sharegpt4-test | 2023-09-12T16:42:45.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of lu-vae/llama2-13b-sharegpt4-test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lu-vae/llama2-13b-sharegpt4-test](https://huggingface.co/lu-vae/llama2-13b-sharegpt4-test)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lu-vae__llama2-13b-sharegpt4-test\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T16:41:26.998548](https://huggingface.co/datasets/open-llm-leaderboard/details_lu-vae__llama2-13b-sharegpt4-test/blob/main/results_2023-09-12T16-41-26.998548.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5607858055587415,\n\
\ \"acc_stderr\": 0.034461007147269726,\n \"acc_norm\": 0.564751512060342,\n\
\ \"acc_norm_stderr\": 0.034441093021876076,\n \"mc1\": 0.3390452876376989,\n\
\ \"mc1_stderr\": 0.016571797910626608,\n \"mc2\": 0.4827157244094245,\n\
\ \"mc2_stderr\": 0.015590156578735876\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5477815699658704,\n \"acc_stderr\": 0.014544519880633829,\n\
\ \"acc_norm\": 0.5802047781569966,\n \"acc_norm_stderr\": 0.014422181226303028\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6249751045608445,\n\
\ \"acc_stderr\": 0.004831399218500231,\n \"acc_norm\": 0.8265285799641505,\n\
\ \"acc_norm_stderr\": 0.0037788044746059103\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.040601270352363966,\n\
\ \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.040601270352363966\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.030151134457776285,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.030151134457776285\n \
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.04062990784146667,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.04062990784146667\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n\
\ \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.5260115606936416,\n \"acc_stderr\": 0.038073017265045125,\n\
\ \"acc_norm\": 0.5260115606936416,\n \"acc_norm_stderr\": 0.038073017265045125\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.27450980392156865,\n\
\ \"acc_stderr\": 0.04440521906179327,\n \"acc_norm\": 0.27450980392156865,\n\
\ \"acc_norm_stderr\": 0.04440521906179327\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720685,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720685\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.46382978723404256,\n\
\ \"acc_stderr\": 0.032600385118357715,\n \"acc_norm\": 0.46382978723404256,\n\
\ \"acc_norm_stderr\": 0.032600385118357715\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.04185774424022057,\n\
\ \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.04185774424022057\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n \"\
acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31746031746031744,\n \"acc_stderr\": 0.023973861998992072,\n \"\
acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.023973861998992072\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6741935483870968,\n\
\ \"acc_stderr\": 0.026662010578567107,\n \"acc_norm\": 0.6741935483870968,\n\
\ \"acc_norm_stderr\": 0.026662010578567107\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n\
\ \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"\
acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.029252823291803644,\n\
\ \"acc_norm\": 0.7927461139896373,\n \"acc_norm_stderr\": 0.029252823291803644\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.517948717948718,\n \"acc_stderr\": 0.025334667080954932,\n \
\ \"acc_norm\": 0.517948717948718,\n \"acc_norm_stderr\": 0.025334667080954932\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5798319327731093,\n \"acc_stderr\": 0.03206183783236152,\n \
\ \"acc_norm\": 0.5798319327731093,\n \"acc_norm_stderr\": 0.03206183783236152\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7376146788990826,\n \"acc_stderr\": 0.01886188502153473,\n \"\
acc_norm\": 0.7376146788990826,\n \"acc_norm_stderr\": 0.01886188502153473\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896079,\n \"\
acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896079\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591362,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n\
\ \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n\
\ \"acc_stderr\": 0.045879047413018105,\n \"acc_norm\": 0.6574074074074074,\n\
\ \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219587,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219587\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.04582124160161549,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.04582124160161549\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8247863247863247,\n\
\ \"acc_stderr\": 0.02490443909891822,\n \"acc_norm\": 0.8247863247863247,\n\
\ \"acc_norm_stderr\": 0.02490443909891822\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7292464878671775,\n\
\ \"acc_stderr\": 0.015889888362560486,\n \"acc_norm\": 0.7292464878671775,\n\
\ \"acc_norm_stderr\": 0.015889888362560486\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.638728323699422,\n \"acc_stderr\": 0.02586220185227791,\n\
\ \"acc_norm\": 0.638728323699422,\n \"acc_norm_stderr\": 0.02586220185227791\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4011173184357542,\n\
\ \"acc_stderr\": 0.016392221899407065,\n \"acc_norm\": 0.4011173184357542,\n\
\ \"acc_norm_stderr\": 0.016392221899407065\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.027870745278290282,\n\
\ \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.027870745278290282\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6237942122186495,\n\
\ \"acc_stderr\": 0.02751392568354943,\n \"acc_norm\": 0.6237942122186495,\n\
\ \"acc_norm_stderr\": 0.02751392568354943\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.027339546640662737,\n\
\ \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.027339546640662737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4178617992177314,\n\
\ \"acc_stderr\": 0.01259674410899856,\n \"acc_norm\": 0.4178617992177314,\n\
\ \"acc_norm_stderr\": 0.01259674410899856\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5514705882352942,\n \"acc_stderr\": 0.030211479609121596,\n\
\ \"acc_norm\": 0.5514705882352942,\n \"acc_norm_stderr\": 0.030211479609121596\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5800653594771242,\n \"acc_stderr\": 0.019966811178256483,\n \
\ \"acc_norm\": 0.5800653594771242,\n \"acc_norm_stderr\": 0.019966811178256483\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.03125127591089165,\n\
\ \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.03125127591089165\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n\
\ \"acc_stderr\": 0.030360490154014666,\n \"acc_norm\": 0.7562189054726368,\n\
\ \"acc_norm_stderr\": 0.030360490154014666\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3390452876376989,\n\
\ \"mc1_stderr\": 0.016571797910626608,\n \"mc2\": 0.4827157244094245,\n\
\ \"mc2_stderr\": 0.015590156578735876\n }\n}\n```"
repo_url: https://huggingface.co/lu-vae/llama2-13b-sharegpt4-test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|arc:challenge|25_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hellaswag|10_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T16-41-26.998548.parquet'
- config_name: results
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- results_2023-09-12T16-41-26.998548.parquet
- split: latest
path:
- results_2023-09-12T16-41-26.998548.parquet
---
# Dataset Card for Evaluation run of lu-vae/llama2-13b-sharegpt4-test
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lu-vae/llama2-13b-sharegpt4-test
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lu-vae/llama2-13b-sharegpt4-test](https://huggingface.co/lu-vae/llama2-13b-sharegpt4-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lu-vae__llama2-13b-sharegpt4-test",
"harness_truthfulqa_mc_0",
split="train")
```
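For instance, to read the per-example details of a single MMLU subtask from the most recent run, you can target its configuration and the "latest" split directly; a minimal sketch (any configuration name listed above works the same way):
```python
from datasets import load_dataset

# A minimal sketch: pull the latest per-example details for one MMLU subtask.
# Any config name declared above (e.g. "harness_hendrycksTest_business_ethics_5")
# can be substituted for the one used here.
details = load_dataset(
    "open-llm-leaderboard/details_lu-vae__llama2-13b-sharegpt4-test",
    "harness_hendrycksTest_business_ethics_5",
    split="latest",
)
print(details)
```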
## Latest results
These are the [latest results from run 2023-09-12T16:41:26.998548](https://huggingface.co/datasets/open-llm-leaderboard/details_lu-vae__llama2-13b-sharegpt4-test/blob/main/results_2023-09-12T16-41-26.998548.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5607858055587415,
"acc_stderr": 0.034461007147269726,
"acc_norm": 0.564751512060342,
"acc_norm_stderr": 0.034441093021876076,
"mc1": 0.3390452876376989,
"mc1_stderr": 0.016571797910626608,
"mc2": 0.4827157244094245,
"mc2_stderr": 0.015590156578735876
},
"harness|arc:challenge|25": {
"acc": 0.5477815699658704,
"acc_stderr": 0.014544519880633829,
"acc_norm": 0.5802047781569966,
"acc_norm_stderr": 0.014422181226303028
},
"harness|hellaswag|10": {
"acc": 0.6249751045608445,
"acc_stderr": 0.004831399218500231,
"acc_norm": 0.8265285799641505,
"acc_norm_stderr": 0.0037788044746059103
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5328947368421053,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.5328947368421053,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6,
"acc_stderr": 0.030151134457776285,
"acc_norm": 0.6,
"acc_norm_stderr": 0.030151134457776285
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.04062990784146667,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.04062990784146667
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.038073017265045125,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.038073017265045125
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179327,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179327
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46382978723404256,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.46382978723404256,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022057,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022057
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.023973861998992072,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.023973861998992072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6741935483870968,
"acc_stderr": 0.026662010578567107,
"acc_norm": 0.6741935483870968,
"acc_norm_stderr": 0.026662010578567107
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6868686868686869,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.6868686868686869,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7927461139896373,
"acc_stderr": 0.029252823291803644,
"acc_norm": 0.7927461139896373,
"acc_norm_stderr": 0.029252823291803644
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.517948717948718,
"acc_stderr": 0.025334667080954932,
"acc_norm": 0.517948717948718,
"acc_norm_stderr": 0.025334667080954932
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5798319327731093,
"acc_stderr": 0.03206183783236152,
"acc_norm": 0.5798319327731093,
"acc_norm_stderr": 0.03206183783236152
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7376146788990826,
"acc_stderr": 0.01886188502153473,
"acc_norm": 0.7376146788990826,
"acc_norm_stderr": 0.01886188502153473
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4212962962962963,
"acc_stderr": 0.03367462138896079,
"acc_norm": 0.4212962962962963,
"acc_norm_stderr": 0.03367462138896079
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.045879047413018105,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.045879047413018105
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219587,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219587
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.04582124160161549,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.04582124160161549
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8247863247863247,
"acc_stderr": 0.02490443909891822,
"acc_norm": 0.8247863247863247,
"acc_norm_stderr": 0.02490443909891822
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7292464878671775,
"acc_stderr": 0.015889888362560486,
"acc_norm": 0.7292464878671775,
"acc_norm_stderr": 0.015889888362560486
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.638728323699422,
"acc_stderr": 0.02586220185227791,
"acc_norm": 0.638728323699422,
"acc_norm_stderr": 0.02586220185227791
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4011173184357542,
"acc_stderr": 0.016392221899407065,
"acc_norm": 0.4011173184357542,
"acc_norm_stderr": 0.016392221899407065
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.027870745278290282,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.027870745278290282
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6237942122186495,
"acc_stderr": 0.02751392568354943,
"acc_norm": 0.6237942122186495,
"acc_norm_stderr": 0.02751392568354943
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.027339546640662737,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.027339546640662737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4178617992177314,
"acc_stderr": 0.01259674410899856,
"acc_norm": 0.4178617992177314,
"acc_norm_stderr": 0.01259674410899856
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5514705882352942,
"acc_stderr": 0.030211479609121596,
"acc_norm": 0.5514705882352942,
"acc_norm_stderr": 0.030211479609121596
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5800653594771242,
"acc_stderr": 0.019966811178256483,
"acc_norm": 0.5800653594771242,
"acc_norm_stderr": 0.019966811178256483
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6081632653061224,
"acc_stderr": 0.03125127591089165,
"acc_norm": 0.6081632653061224,
"acc_norm_stderr": 0.03125127591089165
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.030360490154014666,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.030360490154014666
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3390452876376989,
"mc1_stderr": 0.016571797910626608,
"mc2": 0.4827157244094245,
"mc2_stderr": 0.015590156578735876
}
}
```
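The aggregated numbers above can also be read back programmatically from the "results" configuration; a minimal sketch (assuming, as with the detail configurations, that the "latest" split points at the newest results parquet):
```python
from datasets import load_dataset

# A minimal sketch: load the aggregated scores from the "results" config.
# The "latest" split points to the most recent results parquet for this repo.
results = load_dataset(
    "open-llm-leaderboard/details_lu-vae__llama2-13b-sharegpt4-test",
    "results",
    split="latest",
)
# Assumption: the results parquet holds one row per run; inspect its columns.
print(results.column_names)
```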
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Resizable/N3OON | 2023-09-12T16:58:29.000Z | [
"license:openrail",
"region:us"
] | Resizable | null | null | null | 0 | 0 | ---
license: openrail
---
|
Isora/Embeddings | 2023-09-12T17:06:30.000Z | [
"license:other",
"region:us"
] | Isora | null | null | null | 0 | 0 | ---
license: other
---
|
kdercksen/scireviewgen-csv | 2023-09-12T17:18:11.000Z | [
"region:us"
] | kdercksen | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: reference
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 1017206768
num_examples: 84705
- name: validation
num_bytes: 52660512
num_examples: 4410
- name: test
num_bytes: 54202617
num_examples: 4457
download_size: 507459864
dataset_size: 1124069897
---
# Dataset Card for "scireviewgen-csv"
This is a dataset built from the CSV export of the SciReviewGen dataset [found here](https://github.com/tetsu9923/SciReviewGen) (see "summarization_csv" for the original download link).
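Given the splits and features declared in the YAML above (train/validation/test, each with `reference` and `target` string fields), a minimal loading sketch looks like this:
```python
from datasets import load_dataset

# A minimal sketch: load the train split; "validation" and "test" work the same way.
ds = load_dataset("kdercksen/scireviewgen-csv", split="train")

# Each example carries the two string fields declared in the card's YAML.
example = ds[0]
print(example["reference"][:200])
print(example["target"][:200])
```
|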
pim2510/Oniel | 2023-09-12T17:25:13.000Z | [
"region:us"
] | pim2510 | null | null | null | 0 | 0 | Entry not found |
Resizable/NEON | 2023-09-12T17:34:28.000Z | [
"license:openrail",
"region:us"
] | Resizable | null | null | null | 0 | 0 | ---
license: openrail
---
|
open-llm-leaderboard/details_CalderaAI__13B-Thorns-l2 | 2023-09-12T17:39:12.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CalderaAI/13B-Thorns-l2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CalderaAI/13B-Thorns-l2](https://huggingface.co/CalderaAI/13B-Thorns-l2) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CalderaAI__13B-Thorns-l2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T17:37:55.153820](https://huggingface.co/datasets/open-llm-leaderboard/details_CalderaAI__13B-Thorns-l2/blob/main/results_2023-09-12T17-37-55.153820.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5710659962992051,\n\
\ \"acc_stderr\": 0.03440386468198291,\n \"acc_norm\": 0.5750519567174067,\n\
\ \"acc_norm_stderr\": 0.0343812990885105,\n \"mc1\": 0.35128518971848227,\n\
\ \"mc1_stderr\": 0.016711358163544403,\n \"mc2\": 0.49519847145337487,\n\
\ \"mc2_stderr\": 0.01575768229677758\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5964163822525598,\n \"acc_stderr\": 0.01433715891426844,\n\
\ \"acc_norm\": 0.628839590443686,\n \"acc_norm_stderr\": 0.014117971901142815\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6329416450906195,\n\
\ \"acc_stderr\": 0.004810175357870938,\n \"acc_norm\": 0.8356901015733917,\n\
\ \"acc_norm_stderr\": 0.0036979923561244713\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.040403110624904356,\n\
\ \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.040403110624904356\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.03019761160019795,\n\
\ \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.03019761160019795\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n\
\ \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.5972222222222222,\n\
\ \"acc_norm_stderr\": 0.04101405519842426\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n\
\ \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.5086705202312138,\n\
\ \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.335978835978836,\n \"acc_stderr\": 0.024326310529149135,\n \"\
acc_norm\": 0.335978835978836,\n \"acc_norm_stderr\": 0.024326310529149135\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6483870967741936,\n \"acc_stderr\": 0.02716253782694846,\n \"\
acc_norm\": 0.6483870967741936,\n \"acc_norm_stderr\": 0.02716253782694846\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.39901477832512317,\n \"acc_stderr\": 0.03445487686264715,\n \"\
acc_norm\": 0.39901477832512317,\n \"acc_norm_stderr\": 0.03445487686264715\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.696969696969697,\n \"acc_stderr\": 0.032742879140268674,\n \"\
acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.032742879140268674\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5282051282051282,\n \"acc_stderr\": 0.025310639254933886,\n\
\ \"acc_norm\": 0.5282051282051282,\n \"acc_norm_stderr\": 0.025310639254933886\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.032145368597886394,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.032145368597886394\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7596330275229358,\n \"acc_stderr\": 0.01832060732096407,\n \"\
acc_norm\": 0.7596330275229358,\n \"acc_norm_stderr\": 0.01832060732096407\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.03350991604696043,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.03350991604696043\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.04260735157644559,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.04260735157644559\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.0384985609879409,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.0384985609879409\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264673,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264673\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\
\ \"acc_stderr\": 0.025598193686652247,\n \"acc_norm\": 0.811965811965812,\n\
\ \"acc_norm_stderr\": 0.025598193686652247\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7598978288633461,\n\
\ \"acc_stderr\": 0.015274685213734195,\n \"acc_norm\": 0.7598978288633461,\n\
\ \"acc_norm_stderr\": 0.015274685213734195\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6502890173410405,\n \"acc_stderr\": 0.025674281456531018,\n\
\ \"acc_norm\": 0.6502890173410405,\n \"acc_norm_stderr\": 0.025674281456531018\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.47039106145251397,\n\
\ \"acc_stderr\": 0.01669315492738357,\n \"acc_norm\": 0.47039106145251397,\n\
\ \"acc_norm_stderr\": 0.01669315492738357\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.027826109307283693,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.027826109307283693\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n\
\ \"acc_stderr\": 0.027368078243971642,\n \"acc_norm\": 0.6334405144694534,\n\
\ \"acc_norm_stderr\": 0.027368078243971642\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6203703703703703,\n \"acc_stderr\": 0.027002521034516468,\n\
\ \"acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.027002521034516468\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236844,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236844\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4439374185136897,\n\
\ \"acc_stderr\": 0.012689708167787682,\n \"acc_norm\": 0.4439374185136897,\n\
\ \"acc_norm_stderr\": 0.012689708167787682\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n\
\ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5947712418300654,\n \"acc_stderr\": 0.01986115519382916,\n \
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.01986115519382916\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.03115715086935557,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.03115715086935557\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35128518971848227,\n\
\ \"mc1_stderr\": 0.016711358163544403,\n \"mc2\": 0.49519847145337487,\n\
\ \"mc2_stderr\": 0.01575768229677758\n }\n}\n```"
repo_url: https://huggingface.co/CalderaAI/13B-Thorns-l2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|arc:challenge|25_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hellaswag|10_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T17-37-55.153820.parquet'
- config_name: results
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- results_2023-09-12T17-37-55.153820.parquet
- split: latest
path:
- results_2023-09-12T17-37-55.153820.parquet
---
# Dataset Card for Evaluation run of CalderaAI/13B-Thorns-l2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CalderaAI/13B-Thorns-l2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CalderaAI/13B-Thorns-l2](https://huggingface.co/CalderaAI/13B-Thorns-l2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CalderaAI__13B-Thorns-l2",
"harness_truthfulqa_mc_0",
split="train")
```
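The same pattern works for any of the per-task configurations declared in this card's metadata. As a minimal sketch (the config and split names below are taken from the metadata above; "latest" always points to the newest run):
```python
from datasets import load_dataset

# Aggregated scores for the whole run (the "results" configuration).
agg = load_dataset("open-llm-leaderboard/details_CalderaAI__13B-Thorns-l2",
                   "results",
                   split="latest")

# Per-example details for one MMLU subtask, addressed by its config name.
astronomy = load_dataset("open-llm-leaderboard/details_CalderaAI__13B-Thorns-l2",
                         "harness_hendrycksTest_astronomy_5",
                         split="latest")
```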
## Latest results
These are the [latest results from run 2023-09-12T17:37:55.153820](https://huggingface.co/datasets/open-llm-leaderboard/details_CalderaAI__13B-Thorns-l2/blob/main/results_2023-09-12T17-37-55.153820.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5710659962992051,
"acc_stderr": 0.03440386468198291,
"acc_norm": 0.5750519567174067,
"acc_norm_stderr": 0.0343812990885105,
"mc1": 0.35128518971848227,
"mc1_stderr": 0.016711358163544403,
"mc2": 0.49519847145337487,
"mc2_stderr": 0.01575768229677758
},
"harness|arc:challenge|25": {
"acc": 0.5964163822525598,
"acc_stderr": 0.01433715891426844,
"acc_norm": 0.628839590443686,
"acc_norm_stderr": 0.014117971901142815
},
"harness|hellaswag|10": {
"acc": 0.6329416450906195,
"acc_stderr": 0.004810175357870938,
"acc_norm": 0.8356901015733917,
"acc_norm_stderr": 0.0036979923561244713
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.040403110624904356,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.040403110624904356
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5962264150943396,
"acc_stderr": 0.03019761160019795,
"acc_norm": 0.5962264150943396,
"acc_norm_stderr": 0.03019761160019795
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.04101405519842426,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.04101405519842426
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.03811890988940412,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.03811890988940412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.042801058373643966,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.042801058373643966
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.335978835978836,
"acc_stderr": 0.024326310529149135,
"acc_norm": 0.335978835978836,
"acc_norm_stderr": 0.024326310529149135
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6483870967741936,
"acc_stderr": 0.02716253782694846,
"acc_norm": 0.6483870967741936,
"acc_norm_stderr": 0.02716253782694846
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39901477832512317,
"acc_stderr": 0.03445487686264715,
"acc_norm": 0.39901477832512317,
"acc_norm_stderr": 0.03445487686264715
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.032742879140268674,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.032742879140268674
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5282051282051282,
"acc_stderr": 0.025310639254933886,
"acc_norm": 0.5282051282051282,
"acc_norm_stderr": 0.025310639254933886
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652458,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652458
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.032145368597886394,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.032145368597886394
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7596330275229358,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.7596330275229358,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.03350991604696043,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.03350991604696043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.04260735157644559,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.04260735157644559
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.0384985609879409,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.0384985609879409
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.044532548363264673,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.044532548363264673
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.025598193686652247,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.025598193686652247
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7598978288633461,
"acc_stderr": 0.015274685213734195,
"acc_norm": 0.7598978288633461,
"acc_norm_stderr": 0.015274685213734195
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6502890173410405,
"acc_stderr": 0.025674281456531018,
"acc_norm": 0.6502890173410405,
"acc_norm_stderr": 0.025674281456531018
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.47039106145251397,
"acc_stderr": 0.01669315492738357,
"acc_norm": 0.47039106145251397,
"acc_norm_stderr": 0.01669315492738357
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.027826109307283693,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.027826109307283693
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6334405144694534,
"acc_stderr": 0.027368078243971642,
"acc_norm": 0.6334405144694534,
"acc_norm_stderr": 0.027368078243971642
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.027002521034516468,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.027002521034516468
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236844,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236844
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4439374185136897,
"acc_stderr": 0.012689708167787682,
"acc_norm": 0.4439374185136897,
"acc_norm_stderr": 0.012689708167787682
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.01986115519382916,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.01986115519382916
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.03115715086935557,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.03115715086935557
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35128518971848227,
"mc1_stderr": 0.016711358163544403,
"mc2": 0.49519847145337487,
"mc2_stderr": 0.01575768229677758
}
}
```
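Because the per-task scores above share a flat key scheme (`harness|<suite>-<subtask>|<n_shot>`), the dictionary can be sliced programmatically. A minimal sketch, using a small excerpt of the values printed above (the full dict has the same shape):
```python
# Excerpt of the results dict printed above; the full dict has the same shape.
results = {
    "harness|hendrycksTest-us_foreign_policy|5": {"acc": 0.84},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.7777777777777778},
    "harness|hendrycksTest-college_physics|5": {"acc": 0.24509803921568626},
    "harness|truthfulqa:mc|0": {"mc1": 0.35128518971848227},
}

# Keep only the MMLU ("hendrycksTest") subtasks and rank them by accuracy.
mmlu = {
    task: scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest")
}
for task, acc in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{task}: {acc:.3f}")
```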
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
bipulparua/llm-lotr-test1 | 2023-09-13T04:54:20.000Z | [
"region:us"
] | bipulparua | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 2196528.0
num_examples: 268
- name: test
num_bytes: 245880.0
num_examples: 30
download_size: 1128455
dataset_size: 2442408.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "llm-lotr-test1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
WhiteAiZ/python | 2023-09-16T21:34:02.000Z | [
"license:creativeml-openrail-m",
"region:us"
] | WhiteAiZ | null | null | null | 0 | 0 | ---
license: creativeml-openrail-m
---
|
MilanHrab/blablabla | 2023-09-12T17:51:43.000Z | [
"region:us"
] | MilanHrab | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_marcchew__Platypus-2-7B-LaMini-14K | 2023-09-12T17:53:36.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of marcchew/Platypus-2-7B-LaMini-14K
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [marcchew/Platypus-2-7B-LaMini-14K](https://huggingface.co/marcchew/Platypus-2-7B-LaMini-14K)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_marcchew__Platypus-2-7B-LaMini-14K\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T17:52:15.447894](https://huggingface.co/datasets/open-llm-leaderboard/details_marcchew__Platypus-2-7B-LaMini-14K/blob/main/results_2023-09-12T17-52-15.447894.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2317842745805016,\n\
\ \"acc_stderr\": 0.030720227059236,\n \"acc_norm\": 0.23290365774343916,\n\
\ \"acc_norm_stderr\": 0.030736403074046886,\n \"mc1\": 0.24112607099143207,\n\
\ \"mc1_stderr\": 0.01497482727975234,\n \"mc2\": 0.4829418943957575,\n\
\ \"mc2_stderr\": 0.016542233586110552\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2363481228668942,\n \"acc_stderr\": 0.012414960524301842,\n\
\ \"acc_norm\": 0.295221843003413,\n \"acc_norm_stderr\": 0.013329750293382318\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25433180641306513,\n\
\ \"acc_stderr\": 0.004345949382382376,\n \"acc_norm\": 0.2615016928898626,\n\
\ \"acc_norm_stderr\": 0.0043855444871439145\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2413793103448276,\n\
\ \"acc_stderr\": 0.01530238012354209,\n \"acc_norm\": 0.2413793103448276,\n\
\ \"acc_norm_stderr\": 0.01530238012354209\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.035650796707083106,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.035650796707083106\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.24112607099143207,\n \"mc1_stderr\": 0.01497482727975234,\n\
\ \"mc2\": 0.4829418943957575,\n \"mc2_stderr\": 0.016542233586110552\n\
\ }\n}\n```"
repo_url: https://huggingface.co/marcchew/Platypus-2-7B-LaMini-14K
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|arc:challenge|25_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hellaswag|10_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T17-52-15.447894.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T17-52-15.447894.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T17-52-15.447894.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T17-52-15.447894.parquet'
- config_name: results
data_files:
- split: 2023_09_12T17_52_15.447894
path:
- results_2023-09-12T17-52-15.447894.parquet
- split: latest
path:
- results_2023-09-12T17-52-15.447894.parquet
---
# Dataset Card for Evaluation run of marcchew/Platypus-2-7B-LaMini-14K
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/marcchew/Platypus-2-7B-LaMini-14K
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [marcchew/Platypus-2-7B-LaMini-14K](https://huggingface.co/marcchew/Platypus-2-7B-LaMini-14K) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_marcchew__Platypus-2-7B-LaMini-14K",
"harness_truthfulqa_mc_0",
split="train")
```
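The timestamped splits and the "latest" alias listed in this card's YAML header can be loaded the same way. A minimal sketch (the config and split names below are taken directly from the YAML configuration list of this card):
```python
from datasets import load_dataset

# Aggregated metrics for the run (the "results" configuration).
results = load_dataset(
    "open-llm-leaderboard/details_marcchew__Platypus-2-7B-LaMini-14K",
    "results",
    split="latest",
)

# Per-sample details for a single MMLU subtask, pinned to the most recent run.
details = load_dataset(
    "open-llm-leaderboard/details_marcchew__Platypus-2-7B-LaMini-14K",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
```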
## Latest results
These are the [latest results from run 2023-09-12T17:52:15.447894](https://huggingface.co/datasets/open-llm-leaderboard/details_marcchew__Platypus-2-7B-LaMini-14K/blob/main/results_2023-09-12T17-52-15.447894.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2317842745805016,
"acc_stderr": 0.030720227059236,
"acc_norm": 0.23290365774343916,
"acc_norm_stderr": 0.030736403074046886,
"mc1": 0.24112607099143207,
"mc1_stderr": 0.01497482727975234,
"mc2": 0.4829418943957575,
"mc2_stderr": 0.016542233586110552
},
"harness|arc:challenge|25": {
"acc": 0.2363481228668942,
"acc_stderr": 0.012414960524301842,
"acc_norm": 0.295221843003413,
"acc_norm_stderr": 0.013329750293382318
},
"harness|hellaswag|10": {
"acc": 0.25433180641306513,
"acc_stderr": 0.004345949382382376,
"acc_norm": 0.2615016928898626,
"acc_norm_stderr": 0.0043855444871439145
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.01530238012354209,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.01530238012354209
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.035650796707083106,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.035650796707083106
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24112607099143207,
"mc1_stderr": 0.01497482727975234,
"mc2": 0.4829418943957575,
"mc2_stderr": 0.016542233586110552
}
}
```
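If you prefer the raw JSON file linked above to the parquet configurations, it can be fetched directly from the dataset repository. A minimal sketch, assuming the `huggingface_hub` package; the exact top-level layout of the JSON may differ between harness versions, so the metrics are looked up defensively:
```python
import json

from huggingface_hub import hf_hub_download

# Download the results file shown above from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_marcchew__Platypus-2-7B-LaMini-14K",
    filename="results_2023-09-12T17-52-15.447894.json",
    repo_type="dataset",
)

with open(path) as f:
    payload = json.load(f)

# Some harness versions nest the metrics under a top-level "results" key.
metrics = payload.get("results", payload)
print(metrics["all"]["acc"])                      # overall accuracy (~0.2318 for this run)
print(metrics["harness|truthfulqa:mc|0"]["mc2"])  # TruthfulQA mc2 score
```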
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
BrookBvn/Zabuza | 2023-09-12T17:54:57.000Z | [
"region:us"
] | BrookBvn | null | null | null | 0 | 0 | Entry not found |
ttooppee/train | 2023-09-12T18:02:00.000Z | [
"region:us"
] | ttooppee | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_Sao10K__Stheno-Mix-L2-20B | 2023-09-12T18:06:33.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Sao10K/Stheno-Mix-L2-20B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Stheno-Mix-L2-20B](https://huggingface.co/Sao10K/Stheno-Mix-L2-20B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Stheno-Mix-L2-20B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T18:05:15.025202](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-Mix-L2-20B/blob/main/results_2023-09-12T18-05-15.025202.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5267628466456128,\n\
\ \"acc_stderr\": 0.03473009445228274,\n \"acc_norm\": 0.5305554357329201,\n\
\ \"acc_norm_stderr\": 0.03471355175067325,\n \"mc1\": 0.3537331701346389,\n\
\ \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5180391612093455,\n\
\ \"mc2_stderr\": 0.015751019412964078\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5460750853242321,\n \"acc_stderr\": 0.014549221105171869,\n\
\ \"acc_norm\": 0.5776450511945392,\n \"acc_norm_stderr\": 0.014434138713379981\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6040629356701852,\n\
\ \"acc_stderr\": 0.0048805154313231605,\n \"acc_norm\": 0.7962557259510058,\n\
\ \"acc_norm_stderr\": 0.004019578428155064\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480863,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480863\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04068942293855797,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04068942293855797\n },\n\
\ \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5547169811320755,\n \"acc_stderr\": 0.030588052974270655,\n\
\ \"acc_norm\": 0.5547169811320755,\n \"acc_norm_stderr\": 0.030588052974270655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5347222222222222,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.5347222222222222,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.48554913294797686,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.48554913294797686,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.04655010411319616,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.04655010411319616\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29365079365079366,\n \"acc_stderr\": 0.023456037383982026,\n \"\
acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.023456037383982026\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6129032258064516,\n\
\ \"acc_stderr\": 0.027709359675032488,\n \"acc_norm\": 0.6129032258064516,\n\
\ \"acc_norm_stderr\": 0.027709359675032488\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162933,\n\
\ \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162933\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031595,\n\
\ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031595\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6767676767676768,\n \"acc_stderr\": 0.03332299921070644,\n \"\
acc_norm\": 0.6767676767676768,\n \"acc_norm_stderr\": 0.03332299921070644\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7668393782383419,\n \"acc_stderr\": 0.03051611137147602,\n\
\ \"acc_norm\": 0.7668393782383419,\n \"acc_norm_stderr\": 0.03051611137147602\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.47435897435897434,\n \"acc_stderr\": 0.025317649726448663,\n\
\ \"acc_norm\": 0.47435897435897434,\n \"acc_norm_stderr\": 0.025317649726448663\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.026466117538959916,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.026466117538959916\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4957983193277311,\n \"acc_stderr\": 0.03247734334448111,\n \
\ \"acc_norm\": 0.4957983193277311,\n \"acc_norm_stderr\": 0.03247734334448111\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7119266055045872,\n\
\ \"acc_stderr\": 0.019416445892636032,\n \"acc_norm\": 0.7119266055045872,\n\
\ \"acc_norm_stderr\": 0.019416445892636032\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.33796296296296297,\n \"acc_stderr\": 0.032259413526312945,\n\
\ \"acc_norm\": 0.33796296296296297,\n \"acc_norm_stderr\": 0.032259413526312945\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7254901960784313,\n \"acc_stderr\": 0.03132179803083291,\n \"\
acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.03132179803083291\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6835443037974683,\n \"acc_stderr\": 0.03027497488021898,\n \
\ \"acc_norm\": 0.6835443037974683,\n \"acc_norm_stderr\": 0.03027497488021898\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912073,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912073\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n\
\ \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.04656147110012351,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.04656147110012351\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7435897435897436,\n\
\ \"acc_stderr\": 0.028605953702004267,\n \"acc_norm\": 0.7435897435897436,\n\
\ \"acc_norm_stderr\": 0.028605953702004267\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7241379310344828,\n\
\ \"acc_stderr\": 0.015982814774695625,\n \"acc_norm\": 0.7241379310344828,\n\
\ \"acc_norm_stderr\": 0.015982814774695625\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5838150289017341,\n \"acc_stderr\": 0.02653818910470548,\n\
\ \"acc_norm\": 0.5838150289017341,\n \"acc_norm_stderr\": 0.02653818910470548\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27150837988826815,\n\
\ \"acc_stderr\": 0.014874252168095275,\n \"acc_norm\": 0.27150837988826815,\n\
\ \"acc_norm_stderr\": 0.014874252168095275\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.02818059632825928,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.02818059632825928\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6109324758842444,\n\
\ \"acc_stderr\": 0.027690337536485372,\n \"acc_norm\": 0.6109324758842444,\n\
\ \"acc_norm_stderr\": 0.027690337536485372\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6234567901234568,\n \"acc_stderr\": 0.02695934451874778,\n\
\ \"acc_norm\": 0.6234567901234568,\n \"acc_norm_stderr\": 0.02695934451874778\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.02914454478159615,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.02914454478159615\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38722294654498046,\n\
\ \"acc_stderr\": 0.012441155326854924,\n \"acc_norm\": 0.38722294654498046,\n\
\ \"acc_norm_stderr\": 0.012441155326854924\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5036764705882353,\n \"acc_stderr\": 0.030372015885428195,\n\
\ \"acc_norm\": 0.5036764705882353,\n \"acc_norm_stderr\": 0.030372015885428195\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5196078431372549,\n \"acc_stderr\": 0.020212274976302954,\n \
\ \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.020212274976302954\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6163265306122448,\n \"acc_stderr\": 0.031130880396235933,\n\
\ \"acc_norm\": 0.6163265306122448,\n \"acc_norm_stderr\": 0.031130880396235933\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6965174129353234,\n\
\ \"acc_stderr\": 0.032510068164586174,\n \"acc_norm\": 0.6965174129353234,\n\
\ \"acc_norm_stderr\": 0.032510068164586174\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3537331701346389,\n\
\ \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5180391612093455,\n\
\ \"mc2_stderr\": 0.015751019412964078\n }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Stheno-Mix-L2-20B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|arc:challenge|25_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hellaswag|10_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T18-05-15.025202.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T18-05-15.025202.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T18-05-15.025202.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T18-05-15.025202.parquet'
- config_name: results
data_files:
- split: 2023_09_12T18_05_15.025202
path:
- results_2023-09-12T18-05-15.025202.parquet
- split: latest
path:
- results_2023-09-12T18-05-15.025202.parquet
---
# Dataset Card for Evaluation run of Sao10K/Stheno-Mix-L2-20B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/Stheno-Mix-L2-20B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sao10K/Stheno-Mix-L2-20B](https://huggingface.co/Sao10K/Stheno-Mix-L2-20B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Stheno-Mix-L2-20B",
"harness_truthfulqa_mc_0",
split="train")
```
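Every configuration in this card's YAML header also exposes a `latest` split, so the details of a single task can be loaded the same way. A minimal sketch, assuming the config names listed above (e.g. `harness_hendrycksTest_abstract_algebra_5`):
```python
from datasets import load_dataset

# Load the latest run's details for one MMLU subtask; the config name and
# the "latest" split both come from the YAML header of this card.
details = load_dataset(
    "open-llm-leaderboard/details_Sao10K__Stheno-Mix-L2-20B",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
print(details)  # inspect the available columns and number of rows
```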
## Latest results
These are the [latest results from run 2023-09-12T18:05:15.025202](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-Mix-L2-20B/blob/main/results_2023-09-12T18-05-15.025202.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5267628466456128,
"acc_stderr": 0.03473009445228274,
"acc_norm": 0.5305554357329201,
"acc_norm_stderr": 0.03471355175067325,
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5180391612093455,
"mc2_stderr": 0.015751019412964078
},
"harness|arc:challenge|25": {
"acc": 0.5460750853242321,
"acc_stderr": 0.014549221105171869,
"acc_norm": 0.5776450511945392,
"acc_norm_stderr": 0.014434138713379981
},
"harness|hellaswag|10": {
"acc": 0.6040629356701852,
"acc_stderr": 0.0048805154313231605,
"acc_norm": 0.7962557259510058,
"acc_norm_stderr": 0.004019578428155064
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480863,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480863
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5,
"acc_stderr": 0.04068942293855797,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04068942293855797
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5547169811320755,
"acc_stderr": 0.030588052974270655,
"acc_norm": 0.5547169811320755,
"acc_norm_stderr": 0.030588052974270655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5347222222222222,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.5347222222222222,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.48554913294797686,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.48554913294797686,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.04655010411319616,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.04655010411319616
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.023456037383982026,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.023456037383982026
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6129032258064516,
"acc_stderr": 0.027709359675032488,
"acc_norm": 0.6129032258064516,
"acc_norm_stderr": 0.027709359675032488
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162933,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162933
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.03697442205031595,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.03697442205031595
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6767676767676768,
"acc_stderr": 0.03332299921070644,
"acc_norm": 0.6767676767676768,
"acc_norm_stderr": 0.03332299921070644
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7668393782383419,
"acc_stderr": 0.03051611137147602,
"acc_norm": 0.7668393782383419,
"acc_norm_stderr": 0.03051611137147602
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.47435897435897434,
"acc_stderr": 0.025317649726448663,
"acc_norm": 0.47435897435897434,
"acc_norm_stderr": 0.025317649726448663
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.026466117538959916,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.026466117538959916
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4957983193277311,
"acc_stderr": 0.03247734334448111,
"acc_norm": 0.4957983193277311,
"acc_norm_stderr": 0.03247734334448111
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7119266055045872,
"acc_stderr": 0.019416445892636032,
"acc_norm": 0.7119266055045872,
"acc_norm_stderr": 0.019416445892636032
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.33796296296296297,
"acc_stderr": 0.032259413526312945,
"acc_norm": 0.33796296296296297,
"acc_norm_stderr": 0.032259413526312945
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.03132179803083291,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.03132179803083291
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6835443037974683,
"acc_stderr": 0.03027497488021898,
"acc_norm": 0.6835443037974683,
"acc_norm_stderr": 0.03027497488021898
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912073,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912073
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497751,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497751
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.04656147110012351,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.04656147110012351
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7435897435897436,
"acc_stderr": 0.028605953702004267,
"acc_norm": 0.7435897435897436,
"acc_norm_stderr": 0.028605953702004267
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7241379310344828,
"acc_stderr": 0.015982814774695625,
"acc_norm": 0.7241379310344828,
"acc_norm_stderr": 0.015982814774695625
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.02653818910470548,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.02653818910470548
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27150837988826815,
"acc_stderr": 0.014874252168095275,
"acc_norm": 0.27150837988826815,
"acc_norm_stderr": 0.014874252168095275
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.02818059632825928,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.02818059632825928
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6109324758842444,
"acc_stderr": 0.027690337536485372,
"acc_norm": 0.6109324758842444,
"acc_norm_stderr": 0.027690337536485372
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6234567901234568,
"acc_stderr": 0.02695934451874778,
"acc_norm": 0.6234567901234568,
"acc_norm_stderr": 0.02695934451874778
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.02914454478159615,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.02914454478159615
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.38722294654498046,
"acc_stderr": 0.012441155326854924,
"acc_norm": 0.38722294654498046,
"acc_norm_stderr": 0.012441155326854924
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5036764705882353,
"acc_stderr": 0.030372015885428195,
"acc_norm": 0.5036764705882353,
"acc_norm_stderr": 0.030372015885428195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.020212274976302954,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.020212274976302954
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6163265306122448,
"acc_stderr": 0.031130880396235933,
"acc_norm": 0.6163265306122448,
"acc_norm_stderr": 0.031130880396235933
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6965174129353234,
"acc_stderr": 0.032510068164586174,
"acc_norm": 0.6965174129353234,
"acc_norm_stderr": 0.032510068164586174
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5180391612093455,
"mc2_stderr": 0.015751019412964078
}
}
```
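The aggregated figures above are also stored as the `results` configuration declared in the YAML header. A minimal sketch for loading them, assuming only the config and split names shown there (the exact column layout is not documented in this card):
```python
from datasets import load_dataset

# Load the aggregated results; "results" is the config name and "latest"
# the split name declared in this card's YAML header.
results = load_dataset(
    "open-llm-leaderboard/details_Sao10K__Stheno-Mix-L2-20B",
    "results",
    split="latest",
)
print(results)  # inspect the stored aggregate metrics
```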
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Ali-C137/Goud-Sum-Instruct-test-v0 | 2023-09-12T18:19:19.000Z | [
"region:us"
] | Ali-C137 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 22447355
num_examples: 9497
download_size: 10247768
dataset_size: 22447355
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Goud-Sum-Instruct"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tayamaken/Elza | 2023-09-12T18:31:54.000Z | [
"region:us"
] | tayamaken | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_Sao10K__Stheno-1.3-L2-13B | 2023-09-12T18:40:13.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Sao10K/Stheno-1.3-L2-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Stheno-1.3-L2-13B](https://huggingface.co/Sao10K/Stheno-1.3-L2-13B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Stheno-1.3-L2-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T18:38:55.194574](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-1.3-L2-13B/blob/main/results_2023-09-12T18-38-55.194574.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5294218250613762,\n\
\ \"acc_stderr\": 0.034699104656902666,\n \"acc_norm\": 0.5334472967463489,\n\
\ \"acc_norm_stderr\": 0.03468043669106848,\n \"mc1\": 0.34761321909424725,\n\
\ \"mc1_stderr\": 0.016670769188897303,\n \"mc2\": 0.5023124365778059,\n\
\ \"mc2_stderr\": 0.015829473943426323\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5307167235494881,\n \"acc_stderr\": 0.014583792546304038,\n\
\ \"acc_norm\": 0.568259385665529,\n \"acc_norm_stderr\": 0.014474591427196202\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6170085640310695,\n\
\ \"acc_stderr\": 0.004851227527070901,\n \"acc_norm\": 0.8169687313284206,\n\
\ \"acc_norm_stderr\": 0.0038590186619619966\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.506578947368421,\n \"acc_stderr\": 0.040685900502249704,\n\
\ \"acc_norm\": 0.506578947368421,\n \"acc_norm_stderr\": 0.040685900502249704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5471698113207547,\n \"acc_stderr\": 0.030635627957961816,\n\
\ \"acc_norm\": 0.5471698113207547,\n \"acc_norm_stderr\": 0.030635627957961816\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5347222222222222,\n\
\ \"acc_stderr\": 0.041711158581816184,\n \"acc_norm\": 0.5347222222222222,\n\
\ \"acc_norm_stderr\": 0.041711158581816184\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4913294797687861,\n\
\ \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.4913294797687861,\n\
\ \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39148936170212767,\n \"acc_stderr\": 0.03190701242326812,\n\
\ \"acc_norm\": 0.39148936170212767,\n \"acc_norm_stderr\": 0.03190701242326812\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2830687830687831,\n \"acc_stderr\": 0.02320139293819498,\n \"\
acc_norm\": 0.2830687830687831,\n \"acc_norm_stderr\": 0.02320139293819498\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.042857142857142816,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.042857142857142816\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6161290322580645,\n\
\ \"acc_stderr\": 0.027666182075539645,\n \"acc_norm\": 0.6161290322580645,\n\
\ \"acc_norm_stderr\": 0.027666182075539645\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4236453201970443,\n \"acc_stderr\": 0.03476725747649038,\n\
\ \"acc_norm\": 0.4236453201970443,\n \"acc_norm_stderr\": 0.03476725747649038\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.03742597043806587,\n\
\ \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.03742597043806587\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6818181818181818,\n \"acc_stderr\": 0.03318477333845331,\n \"\
acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.03318477333845331\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7823834196891192,\n \"acc_stderr\": 0.029778663037752964,\n\
\ \"acc_norm\": 0.7823834196891192,\n \"acc_norm_stderr\": 0.029778663037752964\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.48205128205128206,\n \"acc_stderr\": 0.025334667080954932,\n\
\ \"acc_norm\": 0.48205128205128206,\n \"acc_norm_stderr\": 0.025334667080954932\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340496,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5168067226890757,\n \"acc_stderr\": 0.03246013680375308,\n \
\ \"acc_norm\": 0.5168067226890757,\n \"acc_norm_stderr\": 0.03246013680375308\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6954128440366972,\n \"acc_stderr\": 0.019732299420354045,\n \"\
acc_norm\": 0.6954128440366972,\n \"acc_norm_stderr\": 0.019732299420354045\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.032149521478027514,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.032149521478027514\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604257,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604257\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6877637130801688,\n \"acc_stderr\": 0.03016513786784701,\n \
\ \"acc_norm\": 0.6877637130801688,\n \"acc_norm_stderr\": 0.03016513786784701\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n\
\ \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7606837606837606,\n\
\ \"acc_stderr\": 0.027951826808924336,\n \"acc_norm\": 0.7606837606837606,\n\
\ \"acc_norm_stderr\": 0.027951826808924336\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7113665389527458,\n\
\ \"acc_stderr\": 0.016203792703197765,\n \"acc_norm\": 0.7113665389527458,\n\
\ \"acc_norm_stderr\": 0.016203792703197765\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.026483392042098174,\n\
\ \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.026483392042098174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2659217877094972,\n\
\ \"acc_stderr\": 0.014776765066438888,\n \"acc_norm\": 0.2659217877094972,\n\
\ \"acc_norm_stderr\": 0.014776765066438888\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.02818059632825928,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.02818059632825928\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6077170418006431,\n\
\ \"acc_stderr\": 0.027731258647011994,\n \"acc_norm\": 0.6077170418006431,\n\
\ \"acc_norm_stderr\": 0.027731258647011994\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6172839506172839,\n \"acc_stderr\": 0.027044538138402605,\n\
\ \"acc_norm\": 0.6172839506172839,\n \"acc_norm_stderr\": 0.027044538138402605\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3829787234042553,\n \"acc_stderr\": 0.028999080904806185,\n \
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.028999080904806185\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4041720990873533,\n\
\ \"acc_stderr\": 0.012533504046491365,\n \"acc_norm\": 0.4041720990873533,\n\
\ \"acc_norm_stderr\": 0.012533504046491365\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.030372836961539352,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.030372836961539352\n \
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\"\
: 0.5228758169934641,\n \"acc_stderr\": 0.02020665318788479,\n \"\
acc_norm\": 0.5228758169934641,\n \"acc_norm_stderr\": 0.02020665318788479\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.03086214492108756,\n\
\ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.03086214492108756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\
\ \"acc_stderr\": 0.03187187537919797,\n \"acc_norm\": 0.7164179104477612,\n\
\ \"acc_norm_stderr\": 0.03187187537919797\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7076023391812866,\n \"acc_stderr\": 0.03488647713457922,\n\
\ \"acc_norm\": 0.7076023391812866,\n \"acc_norm_stderr\": 0.03488647713457922\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34761321909424725,\n\
\ \"mc1_stderr\": 0.016670769188897303,\n \"mc2\": 0.5023124365778059,\n\
\ \"mc2_stderr\": 0.015829473943426323\n }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Stheno-1.3-L2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|arc:challenge|25_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hellaswag|10_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T18-38-55.194574.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T18-38-55.194574.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T18-38-55.194574.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T18-38-55.194574.parquet'
- config_name: results
data_files:
- split: 2023_09_12T18_38_55.194574
path:
- results_2023-09-12T18-38-55.194574.parquet
- split: latest
path:
- results_2023-09-12T18-38-55.194574.parquet
---
# Dataset Card for Evaluation run of Sao10K/Stheno-1.3-L2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/Stheno-1.3-L2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sao10K/Stheno-1.3-L2-13B](https://huggingface.co/Sao10K/Stheno-1.3-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Stheno-1.3-L2-13B",
"harness_truthfulqa_mc_0",
split="train")
```
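Building on the snippet above, the dedicated "results" configuration and the "latest" split (both declared in this card's YAML) can be used to pull the aggregated metrics, or to pin a per-task configuration to its most recent run. A minimal sketch (the exact row layout returned may vary between harness versions):
```python
from datasets import load_dataset

# Aggregated metrics for the whole run live in the "results" configuration;
# the "latest" split always points to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_Sao10K__Stheno-1.3-L2-13B",
                       "results",
                       split="latest")

# Per-task details use one configuration per task, e.g. a single MMLU subtask.
mmlu_algebra = load_dataset("open-llm-leaderboard/details_Sao10K__Stheno-1.3-L2-13B",
                            "harness_hendrycksTest_abstract_algebra_5",
                            split="latest")
```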
## Latest results
These are the [latest results from run 2023-09-12T18:38:55.194574](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-1.3-L2-13B/blob/main/results_2023-09-12T18-38-55.194574.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5294218250613762,
"acc_stderr": 0.034699104656902666,
"acc_norm": 0.5334472967463489,
"acc_norm_stderr": 0.03468043669106848,
"mc1": 0.34761321909424725,
"mc1_stderr": 0.016670769188897303,
"mc2": 0.5023124365778059,
"mc2_stderr": 0.015829473943426323
},
"harness|arc:challenge|25": {
"acc": 0.5307167235494881,
"acc_stderr": 0.014583792546304038,
"acc_norm": 0.568259385665529,
"acc_norm_stderr": 0.014474591427196202
},
"harness|hellaswag|10": {
"acc": 0.6170085640310695,
"acc_stderr": 0.004851227527070901,
"acc_norm": 0.8169687313284206,
"acc_norm_stderr": 0.0038590186619619966
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.506578947368421,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.506578947368421,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5471698113207547,
"acc_stderr": 0.030635627957961816,
"acc_norm": 0.5471698113207547,
"acc_norm_stderr": 0.030635627957961816
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5347222222222222,
"acc_stderr": 0.041711158581816184,
"acc_norm": 0.5347222222222222,
"acc_norm_stderr": 0.041711158581816184
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4913294797687861,
"acc_stderr": 0.03811890988940412,
"acc_norm": 0.4913294797687861,
"acc_norm_stderr": 0.03811890988940412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39148936170212767,
"acc_stderr": 0.03190701242326812,
"acc_norm": 0.39148936170212767,
"acc_norm_stderr": 0.03190701242326812
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2830687830687831,
"acc_stderr": 0.02320139293819498,
"acc_norm": 0.2830687830687831,
"acc_norm_stderr": 0.02320139293819498
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.042857142857142816,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.042857142857142816
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6161290322580645,
"acc_stderr": 0.027666182075539645,
"acc_norm": 0.6161290322580645,
"acc_norm_stderr": 0.027666182075539645
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4236453201970443,
"acc_stderr": 0.03476725747649038,
"acc_norm": 0.4236453201970443,
"acc_norm_stderr": 0.03476725747649038
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.03742597043806587,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.03742597043806587
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.03318477333845331,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.03318477333845331
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7823834196891192,
"acc_stderr": 0.029778663037752964,
"acc_norm": 0.7823834196891192,
"acc_norm_stderr": 0.029778663037752964
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.48205128205128206,
"acc_stderr": 0.025334667080954932,
"acc_norm": 0.48205128205128206,
"acc_norm_stderr": 0.025334667080954932
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340496,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5168067226890757,
"acc_stderr": 0.03246013680375308,
"acc_norm": 0.5168067226890757,
"acc_norm_stderr": 0.03246013680375308
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6954128440366972,
"acc_stderr": 0.019732299420354045,
"acc_norm": 0.6954128440366972,
"acc_norm_stderr": 0.019732299420354045
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.032149521478027514,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.032149521478027514
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.030587591351604257,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.030587591351604257
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6877637130801688,
"acc_stderr": 0.03016513786784701,
"acc_norm": 0.6877637130801688,
"acc_norm_stderr": 0.03016513786784701
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6380368098159509,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.6380368098159509,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7606837606837606,
"acc_stderr": 0.027951826808924336,
"acc_norm": 0.7606837606837606,
"acc_norm_stderr": 0.027951826808924336
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7113665389527458,
"acc_stderr": 0.016203792703197765,
"acc_norm": 0.7113665389527458,
"acc_norm_stderr": 0.016203792703197765
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.026483392042098174,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.026483392042098174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2659217877094972,
"acc_stderr": 0.014776765066438888,
"acc_norm": 0.2659217877094972,
"acc_norm_stderr": 0.014776765066438888
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.02818059632825928,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.02818059632825928
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6077170418006431,
"acc_stderr": 0.027731258647011994,
"acc_norm": 0.6077170418006431,
"acc_norm_stderr": 0.027731258647011994
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6172839506172839,
"acc_stderr": 0.027044538138402605,
"acc_norm": 0.6172839506172839,
"acc_norm_stderr": 0.027044538138402605
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.028999080904806185,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.028999080904806185
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4041720990873533,
"acc_stderr": 0.012533504046491365,
"acc_norm": 0.4041720990873533,
"acc_norm_stderr": 0.012533504046491365
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5,
"acc_stderr": 0.030372836961539352,
"acc_norm": 0.5,
"acc_norm_stderr": 0.030372836961539352
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5228758169934641,
"acc_stderr": 0.02020665318788479,
"acc_norm": 0.5228758169934641,
"acc_norm_stderr": 0.02020665318788479
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.03086214492108756,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.03086214492108756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.03187187537919797,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.03187187537919797
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7076023391812866,
"acc_stderr": 0.03488647713457922,
"acc_norm": 0.7076023391812866,
"acc_norm_stderr": 0.03488647713457922
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34761321909424725,
"mc1_stderr": 0.016670769188897303,
"mc2": 0.5023124365778059,
"mc2_stderr": 0.015829473943426323
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Bebedi/Moneyboy | 2023-09-12T18:39:36.000Z | [
"region:us"
] | Bebedi | null | null | null | 0 | 0 | Entry not found |
CyberHarem/matsurika_pokemon | 2023-09-17T17:34:32.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of matsurika (Pokémon)
This is the dataset of matsurika (Pokémon), containing 146 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 146 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 364 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 146 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 146 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 146 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 146 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 146 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 364 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 364 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 364 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
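The packaged variants listed above can also be fetched programmatically; a minimal sketch using `huggingface_hub` (the filename is any of the archives in the table, and extracting the zip locally is left to the caller):
```python
from huggingface_hub import hf_hub_download

# Download one of the packaged archives from this dataset repository.
zip_path = hf_hub_download(repo_id="CyberHarem/matsurika_pokemon",
                           filename="dataset-512x512.zip",
                           repo_type="dataset")
print(zip_path)  # local cache path of the downloaded archive
```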
|
open-llm-leaderboard/details_jondurbin__spicyboros-7b-2.2 | 2023-09-12T18:49:57.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of jondurbin/spicyboros-7b-2.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/spicyboros-7b-2.2](https://huggingface.co/jondurbin/spicyboros-7b-2.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__spicyboros-7b-2.2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T18:48:40.427009](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__spicyboros-7b-2.2/blob/main/results_2023-09-12T18-48-40.427009.json)\
\ (note that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each of them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4875409141312441,\n\
\ \"acc_stderr\": 0.035190355165560176,\n \"acc_norm\": 0.4914653840964428,\n\
\ \"acc_norm_stderr\": 0.03517349774427833,\n \"mc1\": 0.3182374541003672,\n\
\ \"mc1_stderr\": 0.016305988648920623,\n \"mc2\": 0.4721753546589752,\n\
\ \"mc2_stderr\": 0.014800403085894951\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5324232081911263,\n \"acc_stderr\": 0.01458063756999542,\n\
\ \"acc_norm\": 0.5656996587030717,\n \"acc_norm_stderr\": 0.014484703048857359\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6026687910774746,\n\
\ \"acc_stderr\": 0.004883455188908965,\n \"acc_norm\": 0.8009360685122485,\n\
\ \"acc_norm_stderr\": 0.003984801854418771\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4407894736842105,\n \"acc_stderr\": 0.040403110624904356,\n\
\ \"acc_norm\": 0.4407894736842105,\n \"acc_norm_stderr\": 0.040403110624904356\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4716981132075472,\n \"acc_stderr\": 0.030723535249006107,\n\
\ \"acc_norm\": 0.4716981132075472,\n \"acc_norm_stderr\": 0.030723535249006107\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5138888888888888,\n\
\ \"acc_stderr\": 0.041795966175810016,\n \"acc_norm\": 0.5138888888888888,\n\
\ \"acc_norm_stderr\": 0.041795966175810016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.43,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4161849710982659,\n\
\ \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.4161849710982659,\n\
\ \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179962,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179962\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.03196758697835362,\n\
\ \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.03196758697835362\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2857142857142857,\n \"acc_stderr\": 0.02326651221373058,\n \"\
acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.02326651221373058\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.04073524322147125,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.04073524322147125\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5419354838709678,\n\
\ \"acc_stderr\": 0.02834378725054061,\n \"acc_norm\": 0.5419354838709678,\n\
\ \"acc_norm_stderr\": 0.02834378725054061\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35467980295566504,\n \"acc_stderr\": 0.0336612448905145,\n\
\ \"acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.0336612448905145\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.037937131711656344,\n\
\ \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.037937131711656344\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5808080808080808,\n \"acc_stderr\": 0.03515520728670417,\n \"\
acc_norm\": 0.5808080808080808,\n \"acc_norm_stderr\": 0.03515520728670417\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7305699481865285,\n \"acc_stderr\": 0.032018671228777947,\n\
\ \"acc_norm\": 0.7305699481865285,\n \"acc_norm_stderr\": 0.032018671228777947\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4641025641025641,\n \"acc_stderr\": 0.025285585990017838,\n\
\ \"acc_norm\": 0.4641025641025641,\n \"acc_norm_stderr\": 0.025285585990017838\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.47478991596638653,\n \"acc_stderr\": 0.032437180551374095,\n\
\ \"acc_norm\": 0.47478991596638653,\n \"acc_norm_stderr\": 0.032437180551374095\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6752293577981652,\n \"acc_stderr\": 0.02007772910931033,\n \"\
acc_norm\": 0.6752293577981652,\n \"acc_norm_stderr\": 0.02007772910931033\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.375,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n\
\ \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.03320574612945431,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.03320574612945431\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.70042194092827,\n \"acc_stderr\": 0.0298180247497531,\n \"\
acc_norm\": 0.70042194092827,\n \"acc_norm_stderr\": 0.0298180247497531\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5919282511210763,\n\
\ \"acc_stderr\": 0.03298574607842822,\n \"acc_norm\": 0.5919282511210763,\n\
\ \"acc_norm_stderr\": 0.03298574607842822\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5267175572519084,\n \"acc_stderr\": 0.04379024936553894,\n\
\ \"acc_norm\": 0.5267175572519084,\n \"acc_norm_stderr\": 0.04379024936553894\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.628099173553719,\n \"acc_stderr\": 0.04412015806624504,\n \"acc_norm\"\
: 0.628099173553719,\n \"acc_norm_stderr\": 0.04412015806624504\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.048262172941398944,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.048262172941398944\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5030674846625767,\n \"acc_stderr\": 0.03928297078179663,\n\
\ \"acc_norm\": 0.5030674846625767,\n \"acc_norm_stderr\": 0.03928297078179663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5631067961165048,\n \"acc_stderr\": 0.04911147107365777,\n\
\ \"acc_norm\": 0.5631067961165048,\n \"acc_norm_stderr\": 0.04911147107365777\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7393162393162394,\n\
\ \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.7393162393162394,\n\
\ \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6679438058748404,\n\
\ \"acc_stderr\": 0.01684117465529572,\n \"acc_norm\": 0.6679438058748404,\n\
\ \"acc_norm_stderr\": 0.01684117465529572\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5433526011560693,\n \"acc_stderr\": 0.026817718130348923,\n\
\ \"acc_norm\": 0.5433526011560693,\n \"acc_norm_stderr\": 0.026817718130348923\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217892,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217892\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5065359477124183,\n \"acc_stderr\": 0.028627470550556047,\n\
\ \"acc_norm\": 0.5065359477124183,\n \"acc_norm_stderr\": 0.028627470550556047\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5691318327974276,\n\
\ \"acc_stderr\": 0.028125340983972708,\n \"acc_norm\": 0.5691318327974276,\n\
\ \"acc_norm_stderr\": 0.028125340983972708\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5123456790123457,\n \"acc_stderr\": 0.027812262269327245,\n\
\ \"acc_norm\": 0.5123456790123457,\n \"acc_norm_stderr\": 0.027812262269327245\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3971631205673759,\n \"acc_stderr\": 0.0291898056735871,\n \
\ \"acc_norm\": 0.3971631205673759,\n \"acc_norm_stderr\": 0.0291898056735871\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3650586701434159,\n\
\ \"acc_stderr\": 0.012296373743443478,\n \"acc_norm\": 0.3650586701434159,\n\
\ \"acc_norm_stderr\": 0.012296373743443478\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5220588235294118,\n \"acc_stderr\": 0.030343264224213528,\n\
\ \"acc_norm\": 0.5220588235294118,\n \"acc_norm_stderr\": 0.030343264224213528\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4591503267973856,\n \"acc_stderr\": 0.020160213617222516,\n \
\ \"acc_norm\": 0.4591503267973856,\n \"acc_norm_stderr\": 0.020160213617222516\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.43673469387755104,\n \"acc_stderr\": 0.031751952375833226,\n\
\ \"acc_norm\": 0.43673469387755104,\n \"acc_norm_stderr\": 0.031751952375833226\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6567164179104478,\n\
\ \"acc_stderr\": 0.03357379665433431,\n \"acc_norm\": 0.6567164179104478,\n\
\ \"acc_norm_stderr\": 0.03357379665433431\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3795180722891566,\n\
\ \"acc_stderr\": 0.03777798822748017,\n \"acc_norm\": 0.3795180722891566,\n\
\ \"acc_norm_stderr\": 0.03777798822748017\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.03467826685703826,\n\
\ \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.03467826685703826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3182374541003672,\n\
\ \"mc1_stderr\": 0.016305988648920623,\n \"mc2\": 0.4721753546589752,\n\
\ \"mc2_stderr\": 0.014800403085894951\n }\n}\n```"
repo_url: https://huggingface.co/jondurbin/spicyboros-7b-2.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|arc:challenge|25_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hellaswag|10_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T18-48-40.427009.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T18-48-40.427009.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T18-48-40.427009.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T18-48-40.427009.parquet'
- config_name: results
data_files:
- split: 2023_09_12T18_48_40.427009
path:
- results_2023-09-12T18-48-40.427009.parquet
- split: latest
path:
- results_2023-09-12T18-48-40.427009.parquet
---
# Dataset Card for Evaluation run of jondurbin/spicyboros-7b-2.2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/spicyboros-7b-2.2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/spicyboros-7b-2.2](https://huggingface.co/jondurbin/spicyboros-7b-2.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__spicyboros-7b-2.2",
"harness_truthfulqa_mc_0",
split="train")
```
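The configs listed in the YAML header above all follow the same pattern. As a minimal sketch, the aggregated metrics can be loaded from the `results` config via the `latest` split (which, per the header, always points to the most recent run):
```python
from datasets import load_dataset

# "latest" resolves to the most recent run; per-run splits are named
# after the run timestamp (see the data_files in the YAML header above).
results = load_dataset(
    "open-llm-leaderboard/details_jondurbin__spicyboros-7b-2.2",
    "results",
    split="latest",
)
```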
## Latest results
These are the [latest results from run 2023-09-12T18:48:40.427009](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__spicyboros-7b-2.2/blob/main/results_2023-09-12T18-48-40.427009.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4875409141312441,
"acc_stderr": 0.035190355165560176,
"acc_norm": 0.4914653840964428,
"acc_norm_stderr": 0.03517349774427833,
"mc1": 0.3182374541003672,
"mc1_stderr": 0.016305988648920623,
"mc2": 0.4721753546589752,
"mc2_stderr": 0.014800403085894951
},
"harness|arc:challenge|25": {
"acc": 0.5324232081911263,
"acc_stderr": 0.01458063756999542,
"acc_norm": 0.5656996587030717,
"acc_norm_stderr": 0.014484703048857359
},
"harness|hellaswag|10": {
"acc": 0.6026687910774746,
"acc_stderr": 0.004883455188908965,
"acc_norm": 0.8009360685122485,
"acc_norm_stderr": 0.003984801854418771
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4407894736842105,
"acc_stderr": 0.040403110624904356,
"acc_norm": 0.4407894736842105,
"acc_norm_stderr": 0.040403110624904356
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4716981132075472,
"acc_stderr": 0.030723535249006107,
"acc_norm": 0.4716981132075472,
"acc_norm_stderr": 0.030723535249006107
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.041795966175810016,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.041795966175810016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4161849710982659,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.4161849710982659,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179962,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179962
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.03196758697835362,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.03196758697835362
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.02326651221373058,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.02326651221373058
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147125,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147125
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5419354838709678,
"acc_stderr": 0.02834378725054061,
"acc_norm": 0.5419354838709678,
"acc_norm_stderr": 0.02834378725054061
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.0336612448905145,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.0336612448905145
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.037937131711656344,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.037937131711656344
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5808080808080808,
"acc_stderr": 0.03515520728670417,
"acc_norm": 0.5808080808080808,
"acc_norm_stderr": 0.03515520728670417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7305699481865285,
"acc_stderr": 0.032018671228777947,
"acc_norm": 0.7305699481865285,
"acc_norm_stderr": 0.032018671228777947
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4641025641025641,
"acc_stderr": 0.025285585990017838,
"acc_norm": 0.4641025641025641,
"acc_norm_stderr": 0.025285585990017838
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.47478991596638653,
"acc_stderr": 0.032437180551374095,
"acc_norm": 0.47478991596638653,
"acc_norm_stderr": 0.032437180551374095
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6752293577981652,
"acc_stderr": 0.02007772910931033,
"acc_norm": 0.6752293577981652,
"acc_norm_stderr": 0.02007772910931033
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.375,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.375,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.03320574612945431,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.03320574612945431
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.70042194092827,
"acc_stderr": 0.0298180247497531,
"acc_norm": 0.70042194092827,
"acc_norm_stderr": 0.0298180247497531
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5919282511210763,
"acc_stderr": 0.03298574607842822,
"acc_norm": 0.5919282511210763,
"acc_norm_stderr": 0.03298574607842822
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5267175572519084,
"acc_stderr": 0.04379024936553894,
"acc_norm": 0.5267175572519084,
"acc_norm_stderr": 0.04379024936553894
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.04412015806624504,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.04412015806624504
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.048262172941398944,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.048262172941398944
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5030674846625767,
"acc_stderr": 0.03928297078179663,
"acc_norm": 0.5030674846625767,
"acc_norm_stderr": 0.03928297078179663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.5631067961165048,
"acc_stderr": 0.04911147107365777,
"acc_norm": 0.5631067961165048,
"acc_norm_stderr": 0.04911147107365777
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7393162393162394,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.7393162393162394,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6679438058748404,
"acc_stderr": 0.01684117465529572,
"acc_norm": 0.6679438058748404,
"acc_norm_stderr": 0.01684117465529572
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.026817718130348923,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.026817718130348923
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217892,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217892
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5065359477124183,
"acc_stderr": 0.028627470550556047,
"acc_norm": 0.5065359477124183,
"acc_norm_stderr": 0.028627470550556047
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5691318327974276,
"acc_stderr": 0.028125340983972708,
"acc_norm": 0.5691318327974276,
"acc_norm_stderr": 0.028125340983972708
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5123456790123457,
"acc_stderr": 0.027812262269327245,
"acc_norm": 0.5123456790123457,
"acc_norm_stderr": 0.027812262269327245
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3971631205673759,
"acc_stderr": 0.0291898056735871,
"acc_norm": 0.3971631205673759,
"acc_norm_stderr": 0.0291898056735871
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3650586701434159,
"acc_stderr": 0.012296373743443478,
"acc_norm": 0.3650586701434159,
"acc_norm_stderr": 0.012296373743443478
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5220588235294118,
"acc_stderr": 0.030343264224213528,
"acc_norm": 0.5220588235294118,
"acc_norm_stderr": 0.030343264224213528
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4591503267973856,
"acc_stderr": 0.020160213617222516,
"acc_norm": 0.4591503267973856,
"acc_norm_stderr": 0.020160213617222516
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.43673469387755104,
"acc_stderr": 0.031751952375833226,
"acc_norm": 0.43673469387755104,
"acc_norm_stderr": 0.031751952375833226
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6567164179104478,
"acc_stderr": 0.03357379665433431,
"acc_norm": 0.6567164179104478,
"acc_norm_stderr": 0.03357379665433431
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3795180722891566,
"acc_stderr": 0.03777798822748017,
"acc_norm": 0.3795180722891566,
"acc_norm_stderr": 0.03777798822748017
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3182374541003672,
"mc1_stderr": 0.016305988648920623,
"mc2": 0.4721753546589752,
"mc2_stderr": 0.014800403085894951
}
}
```
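As a minimal sketch of working with these numbers, the `hendrycksTest` (MMLU) entries can be averaged directly from the dict above; the two reproduced entries and the choice of `acc` as the metric are illustrative assumptions, not the leaderboard's exact aggregation:
```python
# A minimal sketch: average one metric over the hendrycksTest (MMLU) subtasks.
# `scores` stands in for the dict printed above (only two entries reproduced here).
scores = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.33},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5111111111111111},
}
mmlu = [v["acc"] for k, v in scores.items() if k.startswith("harness|hendrycksTest")]
print(sum(mmlu) / len(mmlu))  # mean accuracy over the listed subtasks
```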
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/sima_yi_reines_fgo | 2023-09-17T17:34:34.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of sima_yi_reines (Fate/Grand Order)
This is the dataset of sima_yi_reines (Fate/Grand Order), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 470 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 470 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 470 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 470 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
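As a minimal sketch (repo id and filename taken from the table above), one of these archives can be fetched with `huggingface_hub`:
```python
from huggingface_hub import hf_hub_download

# Fetch one of the zip archives listed in the table above.
path = hf_hub_download(
    repo_id="CyberHarem/sima_yi_reines_fgo",
    filename="dataset-raw.zip",
    repo_type="dataset",
)
print(path)  # local path of the cached download
```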
|
Admin0805/Newcc | 2023-09-12T19:14:03.000Z | [
"license:openrail",
"region:us"
] | Admin0805 | null | null | null | 0 | 0 | ---
license: openrail
---
|
CyberHarem/lychee_pokemon | 2023-09-17T17:34:36.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of lychee (Pokémon)
This is the dataset of lychee (Pokémon), containing 93 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 93 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 247 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 93 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 93 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 93 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 93 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 93 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 247 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 247 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 247 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/burnet_pokemon | 2023-09-17T17:34:38.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of burnet (Pokémon)
This is the dataset of burnet (Pokémon), containing 26 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 26 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 70 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 26 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 26 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 26 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 26 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 26 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 70 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 70 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 70 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/anastasia_viy_fgo | 2023-09-17T17:34:40.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of anastasia_viy (Fate/Grand Order)
This is the dataset of anastasia_viy (Fate/Grand Order), containing 81 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 81 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 212 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 81 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 81 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 81 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 81 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 81 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 212 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 212 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 212 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_JosephusCheung__Qwen-LLaMAfied-7B-Chat | 2023-09-12T19:57:36.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of JosephusCheung/Qwen-LLaMAfied-7B-Chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [JosephusCheung/Qwen-LLaMAfied-7B-Chat](https://huggingface.co/JosephusCheung/Qwen-LLaMAfied-7B-Chat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JosephusCheung__Qwen-LLaMAfied-7B-Chat\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T19:56:23.146408](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__Qwen-LLaMAfied-7B-Chat/blob/main/results_2023-09-12T19-56-23.146408.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5357741671495968,\n\
\ \"acc_stderr\": 0.034555420789771196,\n \"acc_norm\": 0.5398123752991899,\n\
\ \"acc_norm_stderr\": 0.03453766441838476,\n \"mc1\": 0.3084455324357405,\n\
\ \"mc1_stderr\": 0.01616803938315687,\n \"mc2\": 0.4608515013805907,\n\
\ \"mc2_stderr\": 0.015086475930316952\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.46501706484641636,\n \"acc_stderr\": 0.014575583922019667,\n\
\ \"acc_norm\": 0.5093856655290102,\n \"acc_norm_stderr\": 0.014608816322065003\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6408086038637721,\n\
\ \"acc_stderr\": 0.0047878291682556555,\n \"acc_norm\": 0.8346942840071699,\n\
\ \"acc_norm_stderr\": 0.0037069708564109647\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.039889037033362836,\n\
\ \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.039889037033362836\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5763888888888888,\n\
\ \"acc_stderr\": 0.04132125019723369,\n \"acc_norm\": 0.5763888888888888,\n\
\ \"acc_norm_stderr\": 0.04132125019723369\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.03807301726504511,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.03807301726504511\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808777,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808777\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35185185185185186,\n \"acc_stderr\": 0.024594975128920945,\n \"\
acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.024594975128920945\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6419354838709678,\n\
\ \"acc_stderr\": 0.027273890594300645,\n \"acc_norm\": 0.6419354838709678,\n\
\ \"acc_norm_stderr\": 0.027273890594300645\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n\
\ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.696969696969697,\n \"acc_stderr\": 0.03274287914026868,\n \"acc_norm\"\
: 0.696969696969697,\n \"acc_norm_stderr\": 0.03274287914026868\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.029519282616817244,\n\
\ \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.029519282616817244\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5128205128205128,\n \"acc_stderr\": 0.02534267129380725,\n \
\ \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.02534267129380725\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02671924078371215,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02671924078371215\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5210084033613446,\n \"acc_stderr\": 0.03244980849990029,\n \
\ \"acc_norm\": 0.5210084033613446,\n \"acc_norm_stderr\": 0.03244980849990029\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7339449541284404,\n \"acc_stderr\": 0.018946022322225604,\n \"\
acc_norm\": 0.7339449541284404,\n \"acc_norm_stderr\": 0.018946022322225604\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"\
acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5931372549019608,\n \"acc_stderr\": 0.03447891136353382,\n \"\
acc_norm\": 0.5931372549019608,\n \"acc_norm_stderr\": 0.03447891136353382\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n\
\ \"acc_stderr\": 0.032867453125679603,\n \"acc_norm\": 0.600896860986547,\n\
\ \"acc_norm_stderr\": 0.032867453125679603\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6196319018404908,\n \"acc_stderr\": 0.038142698932618374,\n\
\ \"acc_norm\": 0.6196319018404908,\n \"acc_norm_stderr\": 0.038142698932618374\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764377,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764377\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6407766990291263,\n \"acc_stderr\": 0.04750458399041697,\n\
\ \"acc_norm\": 0.6407766990291263,\n \"acc_norm_stderr\": 0.04750458399041697\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n\
\ \"acc_stderr\": 0.026853450377009154,\n \"acc_norm\": 0.7863247863247863,\n\
\ \"acc_norm_stderr\": 0.026853450377009154\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7330779054916986,\n\
\ \"acc_stderr\": 0.01581845089477757,\n \"acc_norm\": 0.7330779054916986,\n\
\ \"acc_norm_stderr\": 0.01581845089477757\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.615606936416185,\n \"acc_stderr\": 0.02618966696627204,\n\
\ \"acc_norm\": 0.615606936416185,\n \"acc_norm_stderr\": 0.02618966696627204\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26033519553072626,\n\
\ \"acc_stderr\": 0.014676252009319476,\n \"acc_norm\": 0.26033519553072626,\n\
\ \"acc_norm_stderr\": 0.014676252009319476\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.027732834353363933,\n\
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.027732834353363933\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n\
\ \"acc_stderr\": 0.027155208103200865,\n \"acc_norm\": 0.6463022508038585,\n\
\ \"acc_norm_stderr\": 0.027155208103200865\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6419753086419753,\n \"acc_stderr\": 0.026675611926037086,\n\
\ \"acc_norm\": 0.6419753086419753,\n \"acc_norm_stderr\": 0.026675611926037086\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3723404255319149,\n \"acc_stderr\": 0.028838921471251455,\n \
\ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.028838921471251455\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4132985658409387,\n\
\ \"acc_stderr\": 0.012576779494860083,\n \"acc_norm\": 0.4132985658409387,\n\
\ \"acc_norm_stderr\": 0.012576779494860083\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5220588235294118,\n \"acc_stderr\": 0.03034326422421352,\n\
\ \"acc_norm\": 0.5220588235294118,\n \"acc_norm_stderr\": 0.03034326422421352\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5310457516339869,\n \"acc_stderr\": 0.020188804456361887,\n \
\ \"acc_norm\": 0.5310457516339869,\n \"acc_norm_stderr\": 0.020188804456361887\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n\
\ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7661691542288557,\n\
\ \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.7661691542288557,\n\
\ \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03565079670708312,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03565079670708312\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3084455324357405,\n\
\ \"mc1_stderr\": 0.01616803938315687,\n \"mc2\": 0.4608515013805907,\n\
\ \"mc2_stderr\": 0.015086475930316952\n }\n}\n```"
repo_url: https://huggingface.co/JosephusCheung/Qwen-LLaMAfied-7B-Chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|arc:challenge|25_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hellaswag|10_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T19-56-23.146408.parquet'
- config_name: results
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- results_2023-09-12T19-56-23.146408.parquet
- split: latest
path:
- results_2023-09-12T19-56-23.146408.parquet
---
# Dataset Card for Evaluation run of JosephusCheung/Qwen-LLaMAfied-7B-Chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/JosephusCheung/Qwen-LLaMAfied-7B-Chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [JosephusCheung/Qwen-LLaMAfied-7B-Chat](https://huggingface.co/JosephusCheung/Qwen-LLaMAfied-7B-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JosephusCheung__Qwen-LLaMAfied-7B-Chat",
"harness_truthfulqa_mc_0",
split="train")
```
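The aggregated `results` configuration and the timestamped splits listed in this card's metadata can be loaded the same way. A minimal sketch (the configuration and split names below are taken from the YAML above; `latest` is assumed to alias the most recent run):
```python
from datasets import load_dataset

# Aggregated metrics for the whole run (the "results" configuration).
results = load_dataset(
    "open-llm-leaderboard/details_JosephusCheung__Qwen-LLaMAfied-7B-Chat",
    "results",
    split="latest",
)

# Per-sample details of a specific run, addressed by its timestamped split.
arc_details = load_dataset(
    "open-llm-leaderboard/details_JosephusCheung__Qwen-LLaMAfied-7B-Chat",
    "harness_arc_challenge_25",
    split="2023_09_12T19_56_23.146408",
)
```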
## Latest results
These are the [latest results from run 2023-09-12T19:56:23.146408](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__Qwen-LLaMAfied-7B-Chat/blob/main/results_2023-09-12T19-56-23.146408.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5357741671495968,
"acc_stderr": 0.034555420789771196,
"acc_norm": 0.5398123752991899,
"acc_norm_stderr": 0.03453766441838476,
"mc1": 0.3084455324357405,
"mc1_stderr": 0.01616803938315687,
"mc2": 0.4608515013805907,
"mc2_stderr": 0.015086475930316952
},
"harness|arc:challenge|25": {
"acc": 0.46501706484641636,
"acc_stderr": 0.014575583922019667,
"acc_norm": 0.5093856655290102,
"acc_norm_stderr": 0.014608816322065003
},
"harness|hellaswag|10": {
"acc": 0.6408086038637721,
"acc_stderr": 0.0047878291682556555,
"acc_norm": 0.8346942840071699,
"acc_norm_stderr": 0.0037069708564109647
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.039889037033362836,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.039889037033362836
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5763888888888888,
"acc_stderr": 0.04132125019723369,
"acc_norm": 0.5763888888888888,
"acc_norm_stderr": 0.04132125019723369
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.03807301726504511,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.03807301726504511
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808777,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.024594975128920945,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.024594975128920945
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6419354838709678,
"acc_stderr": 0.027273890594300645,
"acc_norm": 0.6419354838709678,
"acc_norm_stderr": 0.027273890594300645
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03274287914026868,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03274287914026868
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.029519282616817244,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.029519282616817244
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5128205128205128,
"acc_stderr": 0.02534267129380725,
"acc_norm": 0.5128205128205128,
"acc_norm_stderr": 0.02534267129380725
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02671924078371215,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02671924078371215
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5210084033613446,
"acc_stderr": 0.03244980849990029,
"acc_norm": 0.5210084033613446,
"acc_norm_stderr": 0.03244980849990029
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7339449541284404,
"acc_stderr": 0.018946022322225604,
"acc_norm": 0.7339449541284404,
"acc_norm_stderr": 0.018946022322225604
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5931372549019608,
"acc_stderr": 0.03447891136353382,
"acc_norm": 0.5931372549019608,
"acc_norm_stderr": 0.03447891136353382
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.600896860986547,
"acc_stderr": 0.032867453125679603,
"acc_norm": 0.600896860986547,
"acc_norm_stderr": 0.032867453125679603
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6196319018404908,
"acc_stderr": 0.038142698932618374,
"acc_norm": 0.6196319018404908,
"acc_norm_stderr": 0.038142698932618374
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764377,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764377
},
"harness|hendrycksTest-management|5": {
"acc": 0.6407766990291263,
"acc_stderr": 0.04750458399041697,
"acc_norm": 0.6407766990291263,
"acc_norm_stderr": 0.04750458399041697
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.026853450377009154,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.026853450377009154
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7330779054916986,
"acc_stderr": 0.01581845089477757,
"acc_norm": 0.7330779054916986,
"acc_norm_stderr": 0.01581845089477757
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.615606936416185,
"acc_stderr": 0.02618966696627204,
"acc_norm": 0.615606936416185,
"acc_norm_stderr": 0.02618966696627204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26033519553072626,
"acc_stderr": 0.014676252009319476,
"acc_norm": 0.26033519553072626,
"acc_norm_stderr": 0.014676252009319476
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.027732834353363933,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.027732834353363933
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.027155208103200865,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.027155208103200865
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6419753086419753,
"acc_stderr": 0.026675611926037086,
"acc_norm": 0.6419753086419753,
"acc_norm_stderr": 0.026675611926037086
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.028838921471251455,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.028838921471251455
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4132985658409387,
"acc_stderr": 0.012576779494860083,
"acc_norm": 0.4132985658409387,
"acc_norm_stderr": 0.012576779494860083
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5220588235294118,
"acc_stderr": 0.03034326422421352,
"acc_norm": 0.5220588235294118,
"acc_norm_stderr": 0.03034326422421352
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5310457516339869,
"acc_stderr": 0.020188804456361887,
"acc_norm": 0.5310457516339869,
"acc_norm_stderr": 0.020188804456361887
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7661691542288557,
"acc_stderr": 0.029929415408348384,
"acc_norm": 0.7661691542288557,
"acc_norm_stderr": 0.029929415408348384
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03565079670708312,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03565079670708312
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3084455324357405,
"mc1_stderr": 0.01616803938315687,
"mc2": 0.4608515013805907,
"mc2_stderr": 0.015086475930316952
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/lila_pokemon | 2023-09-17T17:34:42.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of lila (Pokémon)
This is the dataset of lila (Pokémon), containing 80 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 80 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 215 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 80 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 80 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 80 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 80 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 80 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 215 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 215 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 215 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
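If you prefer fetching a pack programmatically rather than via the links above, here is a minimal sketch using `huggingface_hub` (the filename is taken from the table above; `repo_type="dataset"` is assumed for this repository):
```python
from huggingface_hub import hf_hub_download

# Download the 384x512 aligned pack listed in the table above.
zip_path = hf_hub_download(
    repo_id="CyberHarem/lila_pokemon",
    filename="dataset-384x512.zip",
    repo_type="dataset",
)
print(zip_path)  # local cache path of the downloaded archive
```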
|
BangumiBase/yahariorenoseishunlovecomewamachigatteiru | 2023-09-29T06:51:25.000Z | [
"size_categories:10K<n<100K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- 10K<n<100K
---
# Bangumi Image Base of Yahari Ore No Seishun Lovecome Wa Machigatte Iru
This is the image base of the bangumi Yahari Ore no Seishun LoveCome wa Machigatte Iru, in which we detected 73 characters and 10654 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may actually contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability); a minimal sketch of such a pass follows the preview table below.
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 1244 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 63 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 285 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 28 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 23 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 14 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 43 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 48 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 18 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 52 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 28 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 147 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 30 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 8 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 3021 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 228 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 85 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 137 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 44 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 22 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 122 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 27 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 23 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 107 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 45 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 41 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 13 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 43 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 23 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 29 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 18 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 8 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 81 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 30 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 28 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 31 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 73 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 32 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 31 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 27 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 106 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 18 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 12 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 28 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 26 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 32 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 81 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 1643 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 72 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 533 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| 50 | 73 | [Download](50/dataset.zip) |  |  |  |  |  |  |  |  |
| 51 | 12 | [Download](51/dataset.zip) |  |  |  |  |  |  |  |  |
| 52 | 37 | [Download](52/dataset.zip) |  |  |  |  |  |  |  |  |
| 53 | 148 | [Download](53/dataset.zip) |  |  |  |  |  |  |  |  |
| 54 | 17 | [Download](54/dataset.zip) |  |  |  |  |  |  |  |  |
| 55 | 16 | [Download](55/dataset.zip) |  |  |  |  |  |  |  |  |
| 56 | 7 | [Download](56/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 57 | 317 | [Download](57/dataset.zip) |  |  |  |  |  |  |  |  |
| 58 | 143 | [Download](58/dataset.zip) |  |  |  |  |  |  |  |  |
| 59 | 193 | [Download](59/dataset.zip) |  |  |  |  |  |  |  |  |
| 60 | 15 | [Download](60/dataset.zip) |  |  |  |  |  |  |  |  |
| 61 | 25 | [Download](61/dataset.zip) |  |  |  |  |  |  |  |  |
| 62 | 168 | [Download](62/dataset.zip) |  |  |  |  |  |  |  |  |
| 63 | 23 | [Download](63/dataset.zip) |  |  |  |  |  |  |  |  |
| 64 | 13 | [Download](64/dataset.zip) |  |  |  |  |  |  |  |  |
| 65 | 18 | [Download](65/dataset.zip) |  |  |  |  |  |  |  |  |
| 66 | 10 | [Download](66/dataset.zip) |  |  |  |  |  |  |  |  |
| 67 | 13 | [Download](67/dataset.zip) |  |  |  |  |  |  |  |  |
| 68 | 81 | [Download](68/dataset.zip) |  |  |  |  |  |  |  |  |
| 69 | 29 | [Download](69/dataset.zip) |  |  |  |  |  |  |  |  |
| 70 | 19 | [Download](70/dataset.zip) |  |  |  |  |  |  |  |  |
| 71 | 31 | [Download](71/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 295 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
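As a minimal sketch of the preprocessing recommended above (the per-character `dataset.zip` layout is taken from the preview table; the image formats and the 320-pixel size threshold are purely illustrative assumptions, not part of the dataset's tooling):
```python
import zipfile
from pathlib import Path

from huggingface_hub import hf_hub_download
from PIL import Image

# Download one character pack (character #0 from the preview table above).
zip_path = hf_hub_download(
    repo_id="BangumiBase/yahariorenoseishunlovecomewamachigatteiru",
    filename="0/dataset.zip",
    repo_type="dataset",
)

out_dir = Path("character_0")
with zipfile.ZipFile(zip_path) as zf:
    zf.extractall(out_dir)

# Crude cleanup pass: drop unreadable or very small files. The threshold
# below is an arbitrary illustration; tune it to your training setup.
for path in out_dir.rglob("*"):
    if path.suffix.lower() not in {".png", ".jpg", ".jpeg", ".webp"}:
        continue
    try:
        with Image.open(path) as im:
            width, height = im.size
            im.verify()  # raises on truncated/corrupt files
        keep = min(width, height) >= 320
    except Exception:
        keep = False
    if not keep:
        path.unlink()
```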
|
open-llm-leaderboard/details_chargoddard__llama-2-16b-nastychat | 2023-09-12T20:08:08.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of chargoddard/llama-2-16b-nastychat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chargoddard/llama-2-16b-nastychat](https://huggingface.co/chargoddard/llama-2-16b-nastychat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__llama-2-16b-nastychat\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T20:06:49.075564](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__llama-2-16b-nastychat/blob/main/results_2023-09-12T20-06-49.075564.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5606885308722296,\n\
\ \"acc_stderr\": 0.03444686483502141,\n \"acc_norm\": 0.5642837191591707,\n\
\ \"acc_norm_stderr\": 0.03443012991144253,\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.016850961061720127,\n \"mc2\": 0.5344740020617761,\n\
\ \"mc2_stderr\": 0.01603631892494342\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5546075085324232,\n \"acc_stderr\": 0.014523987638344078,\n\
\ \"acc_norm\": 0.5742320819112628,\n \"acc_norm_stderr\": 0.014449464278868807\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6134236207926708,\n\
\ \"acc_stderr\": 0.0048596995624514555,\n \"acc_norm\": 0.8059151563433579,\n\
\ \"acc_norm_stderr\": 0.0039468624307729535\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490436,\n\
\ \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490436\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796,\n\
\ \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.03807301726504513,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.03807301726504513\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077615,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077615\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.03252909619613197,\n\
\ \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.03252909619613197\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.04227054451232199,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.04227054451232199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523857,\n \"\
acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523857\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6451612903225806,\n \"acc_stderr\": 0.02721888977330877,\n \"\
acc_norm\": 0.6451612903225806,\n \"acc_norm_stderr\": 0.02721888977330877\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4039408866995074,\n \"acc_stderr\": 0.0345245390382204,\n \"acc_norm\"\
: 0.4039408866995074,\n \"acc_norm_stderr\": 0.0345245390382204\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.036462049632538115,\n\
\ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.036462049632538115\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.696969696969697,\n \"acc_stderr\": 0.032742879140268674,\n \"\
acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.032742879140268674\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.028408953626245258,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.028408953626245258\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5282051282051282,\n \"acc_stderr\": 0.02531063925493389,\n \
\ \"acc_norm\": 0.5282051282051282,\n \"acc_norm_stderr\": 0.02531063925493389\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176095,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176095\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.542016806722689,\n \"acc_stderr\": 0.03236361111951941,\n \
\ \"acc_norm\": 0.542016806722689,\n \"acc_norm_stderr\": 0.03236361111951941\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7376146788990826,\n \"acc_stderr\": 0.018861885021534734,\n \"\
acc_norm\": 0.7376146788990826,\n \"acc_norm_stderr\": 0.018861885021534734\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.41203703703703703,\n \"acc_stderr\": 0.03356787758160835,\n \"\
acc_norm\": 0.41203703703703703,\n \"acc_norm_stderr\": 0.03356787758160835\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350195,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350195\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.042258754519696365,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.042258754519696365\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.03680350371286461,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.03680350371286461\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833586,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833586\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.0465614711001235,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.0465614711001235\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.026246772946890484,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.026246772946890484\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7509578544061303,\n\
\ \"acc_stderr\": 0.015464676163395953,\n \"acc_norm\": 0.7509578544061303,\n\
\ \"acc_norm_stderr\": 0.015464676163395953\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6329479768786127,\n \"acc_stderr\": 0.025950054337654068,\n\
\ \"acc_norm\": 0.6329479768786127,\n \"acc_norm_stderr\": 0.025950054337654068\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39329608938547483,\n\
\ \"acc_stderr\": 0.016337268694270105,\n \"acc_norm\": 0.39329608938547483,\n\
\ \"acc_norm_stderr\": 0.016337268694270105\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6013071895424836,\n \"acc_stderr\": 0.028036092273891776,\n\
\ \"acc_norm\": 0.6013071895424836,\n \"acc_norm_stderr\": 0.028036092273891776\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6450617283950617,\n \"acc_stderr\": 0.02662415247884585,\n\
\ \"acc_norm\": 0.6450617283950617,\n \"acc_norm_stderr\": 0.02662415247884585\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40070921985815605,\n \"acc_stderr\": 0.029233465745573083,\n \
\ \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.029233465745573083\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41134289439374183,\n\
\ \"acc_stderr\": 0.01256788267380368,\n \"acc_norm\": 0.41134289439374183,\n\
\ \"acc_norm_stderr\": 0.01256788267380368\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5330882352941176,\n \"acc_stderr\": 0.030306257722468307,\n\
\ \"acc_norm\": 0.5330882352941176,\n \"acc_norm_stderr\": 0.030306257722468307\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5686274509803921,\n \"acc_stderr\": 0.02003639376835263,\n \
\ \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.02003639376835263\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.031362502409358936,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.031362502409358936\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.031524391865554016,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.031524391865554016\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774707,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774707\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.016850961061720127,\n \"mc2\": 0.5344740020617761,\n\
\ \"mc2_stderr\": 0.01603631892494342\n }\n}\n```"
repo_url: https://huggingface.co/chargoddard/llama-2-16b-nastychat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|arc:challenge|25_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hellaswag|10_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T20-06-49.075564.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T20-06-49.075564.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T20-06-49.075564.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T20-06-49.075564.parquet'
- config_name: results
data_files:
- split: 2023_09_12T20_06_49.075564
path:
- results_2023-09-12T20-06-49.075564.parquet
- split: latest
path:
- results_2023-09-12T20-06-49.075564.parquet
---
# Dataset Card for Evaluation run of chargoddard/llama-2-16b-nastychat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/chargoddard/llama-2-16b-nastychat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [chargoddard/llama-2-16b-nastychat](https://huggingface.co/chargoddard/llama-2-16b-nastychat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__llama-2-16b-nastychat",
"harness_truthfulqa_mc_0",
split="train")
```
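For example, to load the aggregated scores rather than a single task's details (a minimal sketch; the "results" configuration and its "latest" split are declared in this repository's configs):
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics of the run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_chargoddard__llama-2-16b-nastychat",
    "results",
    split="latest",
)
```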
## Latest results
These are the [latest results from run 2023-09-12T20:06:49.075564](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__llama-2-16b-nastychat/blob/main/results_2023-09-12T20-06-49.075564.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5606885308722296,
"acc_stderr": 0.03444686483502141,
"acc_norm": 0.5642837191591707,
"acc_norm_stderr": 0.03443012991144253,
"mc1": 0.36474908200734396,
"mc1_stderr": 0.016850961061720127,
"mc2": 0.5344740020617761,
"mc2_stderr": 0.01603631892494342
},
"harness|arc:challenge|25": {
"acc": 0.5546075085324232,
"acc_stderr": 0.014523987638344078,
"acc_norm": 0.5742320819112628,
"acc_norm_stderr": 0.014449464278868807
},
"harness|hellaswag|10": {
"acc": 0.6134236207926708,
"acc_stderr": 0.0048596995624514555,
"acc_norm": 0.8059151563433579,
"acc_norm_stderr": 0.0039468624307729535
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.04040311062490436,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.04040311062490436
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.029832808114796,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.029832808114796
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.03807301726504513,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.03807301726504513
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077615,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077615
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.451063829787234,
"acc_stderr": 0.03252909619613197,
"acc_norm": 0.451063829787234,
"acc_norm_stderr": 0.03252909619613197
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.04227054451232199,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.04227054451232199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.023809523809523857,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.023809523809523857
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6451612903225806,
"acc_stderr": 0.02721888977330877,
"acc_norm": 0.6451612903225806,
"acc_norm_stderr": 0.02721888977330877
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4039408866995074,
"acc_stderr": 0.0345245390382204,
"acc_norm": 0.4039408866995074,
"acc_norm_stderr": 0.0345245390382204
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.036462049632538115,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.036462049632538115
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.032742879140268674,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.032742879140268674
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.028408953626245258,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.028408953626245258
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5282051282051282,
"acc_stderr": 0.02531063925493389,
"acc_norm": 0.5282051282051282,
"acc_norm_stderr": 0.02531063925493389
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176095,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176095
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.542016806722689,
"acc_stderr": 0.03236361111951941,
"acc_norm": 0.542016806722689,
"acc_norm_stderr": 0.03236361111951941
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7376146788990826,
"acc_stderr": 0.018861885021534734,
"acc_norm": 0.7376146788990826,
"acc_norm_stderr": 0.018861885021534734
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.41203703703703703,
"acc_stderr": 0.03356787758160835,
"acc_norm": 0.41203703703703703,
"acc_norm_stderr": 0.03356787758160835
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.042258754519696365,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.042258754519696365
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.03680350371286461,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.03680350371286461
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833586,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833586
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.0465614711001235,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.0465614711001235
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.026246772946890484,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.026246772946890484
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7509578544061303,
"acc_stderr": 0.015464676163395953,
"acc_norm": 0.7509578544061303,
"acc_norm_stderr": 0.015464676163395953
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6329479768786127,
"acc_stderr": 0.025950054337654068,
"acc_norm": 0.6329479768786127,
"acc_norm_stderr": 0.025950054337654068
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39329608938547483,
"acc_stderr": 0.016337268694270105,
"acc_norm": 0.39329608938547483,
"acc_norm_stderr": 0.016337268694270105
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6013071895424836,
"acc_stderr": 0.028036092273891776,
"acc_norm": 0.6013071895424836,
"acc_norm_stderr": 0.028036092273891776
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6450617283950617,
"acc_stderr": 0.02662415247884585,
"acc_norm": 0.6450617283950617,
"acc_norm_stderr": 0.02662415247884585
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.029233465745573083,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.029233465745573083
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41134289439374183,
"acc_stderr": 0.01256788267380368,
"acc_norm": 0.41134289439374183,
"acc_norm_stderr": 0.01256788267380368
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5330882352941176,
"acc_stderr": 0.030306257722468307,
"acc_norm": 0.5330882352941176,
"acc_norm_stderr": 0.030306257722468307
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.02003639376835263,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.02003639376835263
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.6,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.031524391865554016,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.031524391865554016
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774707,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774707
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36474908200734396,
"mc1_stderr": 0.016850961061720127,
"mc2": 0.5344740020617761,
"mc2_stderr": 0.01603631892494342
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/ogata_chieri_idolmastercinderellagirls | 2023-09-17T17:34:44.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ogata_chieri (THE iDOLM@STER: Cinderella Girls)
This is the dataset of ogata_chieri (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 506 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 506 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 506 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 506 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
madrylab/celeba_224 | 2023-09-12T20:48:11.000Z | [
"region:us"
] | madrylab | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image
sequence:
sequence:
sequence: float32
- name: 5_o_Clock_Shadow
dtype: int64
- name: Arched_Eyebrows
dtype: int64
- name: Attractive
dtype: int64
- name: Bags_Under_Eyes
dtype: int64
- name: Bald
dtype: int64
- name: Bangs
dtype: int64
- name: Big_Lips
dtype: int64
- name: Big_Nose
dtype: int64
- name: Black_Hair
dtype: int64
- name: Blond_Hair
dtype: int64
- name: Blurry
dtype: int64
- name: Brown_Hair
dtype: int64
- name: Bushy_Eyebrows
dtype: int64
- name: Chubby
dtype: int64
- name: Double_Chin
dtype: int64
- name: Eyeglasses
dtype: int64
- name: Goatee
dtype: int64
- name: Gray_Hair
dtype: int64
- name: Heavy_Makeup
dtype: int64
- name: High_Cheekbones
dtype: int64
- name: Male
dtype: int64
- name: Mouth_Slightly_Open
dtype: int64
- name: Mustache
dtype: int64
- name: Narrow_Eyes
dtype: int64
- name: No_Beard
dtype: int64
- name: Oval_Face
dtype: int64
- name: Pale_Skin
dtype: int64
- name: Pointy_Nose
dtype: int64
- name: Receding_Hairline
dtype: int64
- name: Rosy_Cheeks
dtype: int64
- name: Sideburns
dtype: int64
- name: Smiling
dtype: int64
- name: Straight_Hair
dtype: int64
- name: Wavy_Hair
dtype: int64
- name: Wearing_Earrings
dtype: int64
- name: Wearing_Hat
dtype: int64
- name: Wearing_Lipstick
dtype: int64
- name: Wearing_Necklace
dtype: int64
- name: Wearing_Necktie
dtype: int64
- name: Young
dtype: int64
splits:
- name: train
num_bytes: 98497986720
num_examples: 162770
- name: val
num_bytes: 12022236912
num_examples: 19867
- name: test
num_bytes: 12079724832
num_examples: 19962
download_size: 33558671968
dataset_size: 122599948464
---
# Dataset Card for "celeba_224"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MilanHrab/Kosice_training | 2023-09-12T20:47:10.000Z | [
"region:us"
] | MilanHrab | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: name_of_record
dtype: string
- name: speech_array
sequence: float64
- name: sampling_rate
dtype: int64
- name: label
dtype: string
splits:
- name: train
num_bytes: 1178840561.6
num_examples: 4480
download_size: 894629427
dataset_size: 1178840561.6
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Kosice_training"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MilanHrab/Kosice_test | 2023-09-12T20:47:46.000Z | [
"region:us"
] | MilanHrab | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: name_of_record
dtype: string
- name: speech_array
sequence: float64
- name: sampling_rate
dtype: int64
- name: label
dtype: string
splits:
- name: train
num_bytes: 294710140.4
num_examples: 1120
download_size: 223895398
dataset_size: 294710140.4
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Kosice_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ulewis/vaxclass | 2023-09-12T20:44:18.000Z | [
"license:mit",
"region:us"
] | ulewis | null | null | null | 0 | 0 | ---
license: mit
---
|
CyberHarem/carnet_pokemon | 2023-09-17T17:34:47.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of carnet (Pokémon)
This is the dataset of carnet (Pokémon), containing 99 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 99 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 258 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 99 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 99 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 99 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 99 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 99 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 258 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 258 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 258 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
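Each package above is likewise a zip archive in this repository; a minimal sketch using `huggingface_hub` that lists a few entries of the raw archive without extracting it:
```python
import zipfile

from huggingface_hub import hf_hub_download

# Fetch the raw archive and peek at its contents.
archive = hf_hub_download(
    repo_id="CyberHarem/carnet_pokemon",
    filename="dataset-raw.zip",
    repo_type="dataset",
)
with zipfile.ZipFile(archive) as zf:
    for name in zf.namelist()[:10]:
        print(name)
```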
|
open-llm-leaderboard/details_chargoddard__llama-2-26b-trenchcoat-stack | 2023-09-12T20:51:50.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of chargoddard/llama-2-26b-trenchcoat-stack
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chargoddard/llama-2-26b-trenchcoat-stack](https://huggingface.co/chargoddard/llama-2-26b-trenchcoat-stack)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__llama-2-26b-trenchcoat-stack\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T20:50:33.844037](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__llama-2-26b-trenchcoat-stack/blob/main/results_2023-09-12T20-50-33.844037.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5372977653324258,\n\
\ \"acc_stderr\": 0.03440963421546945,\n \"acc_norm\": 0.5419168536211492,\n\
\ \"acc_norm_stderr\": 0.03439238826375593,\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062133,\n \"mc2\": 0.40480080736740903,\n\
\ \"mc2_stderr\": 0.014006044450922432\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5102389078498294,\n \"acc_stderr\": 0.014608326906285015,\n\
\ \"acc_norm\": 0.5503412969283277,\n \"acc_norm_stderr\": 0.014537144444284732\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.566620195180243,\n\
\ \"acc_stderr\": 0.00494529127007243,\n \"acc_norm\": 0.799044015136427,\n\
\ \"acc_norm_stderr\": 0.00399896258097481\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n\
\ \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.041227287076512825,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.041227287076512825\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.47398843930635837,\n\
\ \"acc_stderr\": 0.038073017265045125,\n \"acc_norm\": 0.47398843930635837,\n\
\ \"acc_norm_stderr\": 0.038073017265045125\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207762,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207762\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3201058201058201,\n \"acc_stderr\": 0.024026846392873506,\n \"\
acc_norm\": 0.3201058201058201,\n \"acc_norm_stderr\": 0.024026846392873506\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.040061680838488774,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.040061680838488774\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6645161290322581,\n \"acc_stderr\": 0.026860206444724345,\n \"\
acc_norm\": 0.6645161290322581,\n \"acc_norm_stderr\": 0.026860206444724345\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4039408866995074,\n \"acc_stderr\": 0.0345245390382204,\n \"acc_norm\"\
: 0.4039408866995074,\n \"acc_norm_stderr\": 0.0345245390382204\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.03769430314512566,\n\
\ \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.03769430314512566\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.02925282329180363,\n\
\ \"acc_norm\": 0.7927461139896373,\n \"acc_norm_stderr\": 0.02925282329180363\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4897435897435897,\n \"acc_stderr\": 0.025345672221942374,\n\
\ \"acc_norm\": 0.4897435897435897,\n \"acc_norm_stderr\": 0.025345672221942374\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02671924078371217,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02671924078371217\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03242225027115006,\n \
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03242225027115006\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7357798165137615,\n \"acc_stderr\": 0.018904164171510193,\n \"\
acc_norm\": 0.7357798165137615,\n \"acc_norm_stderr\": 0.018904164171510193\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7254901960784313,\n \"acc_stderr\": 0.03132179803083292,\n \"\
acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.03132179803083292\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6962025316455697,\n \"acc_stderr\": 0.029936696387138608,\n \
\ \"acc_norm\": 0.6962025316455697,\n \"acc_norm_stderr\": 0.029936696387138608\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724146,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724146\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264673,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264673\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7521367521367521,\n\
\ \"acc_stderr\": 0.02828632407556438,\n \"acc_norm\": 0.7521367521367521,\n\
\ \"acc_norm_stderr\": 0.02828632407556438\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.015671006009339586,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.015671006009339586\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.025770292082977247,\n\
\ \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.025770292082977247\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25139664804469275,\n\
\ \"acc_stderr\": 0.014508979453553991,\n \"acc_norm\": 0.25139664804469275,\n\
\ \"acc_norm_stderr\": 0.014508979453553991\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5947712418300654,\n \"acc_stderr\": 0.028110928492809075,\n\
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.028110928492809075\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6270096463022508,\n\
\ \"acc_stderr\": 0.0274666102131401,\n \"acc_norm\": 0.6270096463022508,\n\
\ \"acc_norm_stderr\": 0.0274666102131401\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6080246913580247,\n \"acc_stderr\": 0.027163686038271143,\n\
\ \"acc_norm\": 0.6080246913580247,\n \"acc_norm_stderr\": 0.027163686038271143\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370597,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370597\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4106910039113429,\n\
\ \"acc_stderr\": 0.01256487154253435,\n \"acc_norm\": 0.4106910039113429,\n\
\ \"acc_norm_stderr\": 0.01256487154253435\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.43014705882352944,\n \"acc_stderr\": 0.030074971917302875,\n\
\ \"acc_norm\": 0.43014705882352944,\n \"acc_norm_stderr\": 0.030074971917302875\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5163398692810458,\n \"acc_stderr\": 0.02021703065318646,\n \
\ \"acc_norm\": 0.5163398692810458,\n \"acc_norm_stderr\": 0.02021703065318646\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6122448979591837,\n \"acc_stderr\": 0.031192230726795656,\n\
\ \"acc_norm\": 0.6122448979591837,\n \"acc_norm_stderr\": 0.031192230726795656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7213930348258707,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.7213930348258707,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.03301405946987249,\n\
\ \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.03301405946987249\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062133,\n \"mc2\": 0.40480080736740903,\n\
\ \"mc2_stderr\": 0.014006044450922432\n }\n}\n```"
repo_url: https://huggingface.co/chargoddard/llama-2-26b-trenchcoat-stack
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|arc:challenge|25_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hellaswag|10_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T20-50-33.844037.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T20-50-33.844037.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T20-50-33.844037.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T20-50-33.844037.parquet'
- config_name: results
data_files:
- split: 2023_09_12T20_50_33.844037
path:
- results_2023-09-12T20-50-33.844037.parquet
- split: latest
path:
- results_2023-09-12T20-50-33.844037.parquet
---
# Dataset Card for Evaluation run of chargoddard/llama-2-26b-trenchcoat-stack
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/chargoddard/llama-2-26b-trenchcoat-stack
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [chargoddard/llama-2-26b-trenchcoat-stack](https://huggingface.co/chargoddard/llama-2-26b-trenchcoat-stack) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__llama-2-26b-trenchcoat-stack",
"harness_truthfulqa_mc_0",
split="train")
```
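The same pattern works for any configuration listed in the YAML header above; for instance, a sketch that loads the aggregated metrics at their latest timestamp (config and split names come from that header):
```python
from datasets import load_dataset

# "results" holds the aggregated run metrics; the "latest" split always
# points at the newest timestamped parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_chargoddard__llama-2-26b-trenchcoat-stack",
    "results",
    split="latest",
)
print(results[0])
```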
## Latest results
These are the [latest results from run 2023-09-12T20:50:33.844037](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__llama-2-26b-trenchcoat-stack/blob/main/results_2023-09-12T20-50-33.844037.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5372977653324258,
"acc_stderr": 0.03440963421546945,
"acc_norm": 0.5419168536211492,
"acc_norm_stderr": 0.03439238826375593,
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062133,
"mc2": 0.40480080736740903,
"mc2_stderr": 0.014006044450922432
},
"harness|arc:challenge|25": {
"acc": 0.5102389078498294,
"acc_stderr": 0.014608326906285015,
"acc_norm": 0.5503412969283277,
"acc_norm_stderr": 0.014537144444284732
},
"harness|hellaswag|10": {
"acc": 0.566620195180243,
"acc_stderr": 0.00494529127007243,
"acc_norm": 0.799044015136427,
"acc_norm_stderr": 0.00399896258097481
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6113207547169811,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.6113207547169811,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.041227287076512825,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.041227287076512825
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.47398843930635837,
"acc_stderr": 0.038073017265045125,
"acc_norm": 0.47398843930635837,
"acc_norm_stderr": 0.038073017265045125
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207762,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207762
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.451063829787234,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.451063829787234,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3201058201058201,
"acc_stderr": 0.024026846392873506,
"acc_norm": 0.3201058201058201,
"acc_norm_stderr": 0.024026846392873506
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.040061680838488774,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.040061680838488774
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.026860206444724345,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.026860206444724345
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4039408866995074,
"acc_stderr": 0.0345245390382204,
"acc_norm": 0.4039408866995074,
"acc_norm_stderr": 0.0345245390382204
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6303030303030303,
"acc_stderr": 0.03769430314512566,
"acc_norm": 0.6303030303030303,
"acc_norm_stderr": 0.03769430314512566
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7927461139896373,
"acc_stderr": 0.02925282329180363,
"acc_norm": 0.7927461139896373,
"acc_norm_stderr": 0.02925282329180363
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4897435897435897,
"acc_stderr": 0.025345672221942374,
"acc_norm": 0.4897435897435897,
"acc_norm_stderr": 0.025345672221942374
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02671924078371217,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02671924078371217
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03242225027115006,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03242225027115006
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7357798165137615,
"acc_stderr": 0.018904164171510193,
"acc_norm": 0.7357798165137615,
"acc_norm_stderr": 0.018904164171510193
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.03132179803083292,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.03132179803083292
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6962025316455697,
"acc_stderr": 0.029936696387138608,
"acc_norm": 0.6962025316455697,
"acc_norm_stderr": 0.029936696387138608
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724146,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724146
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.044532548363264673,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.044532548363264673
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7521367521367521,
"acc_stderr": 0.02828632407556438,
"acc_norm": 0.7521367521367521,
"acc_norm_stderr": 0.02828632407556438
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.015671006009339586,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.015671006009339586
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6445086705202312,
"acc_stderr": 0.025770292082977247,
"acc_norm": 0.6445086705202312,
"acc_norm_stderr": 0.025770292082977247
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25139664804469275,
"acc_stderr": 0.014508979453553991,
"acc_norm": 0.25139664804469275,
"acc_norm_stderr": 0.014508979453553991
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.028110928492809075,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.028110928492809075
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6270096463022508,
"acc_stderr": 0.0274666102131401,
"acc_norm": 0.6270096463022508,
"acc_norm_stderr": 0.0274666102131401
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6080246913580247,
"acc_stderr": 0.027163686038271143,
"acc_norm": 0.6080246913580247,
"acc_norm_stderr": 0.027163686038271143
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370597,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370597
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4106910039113429,
"acc_stderr": 0.01256487154253435,
"acc_norm": 0.4106910039113429,
"acc_norm_stderr": 0.01256487154253435
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.43014705882352944,
"acc_stderr": 0.030074971917302875,
"acc_norm": 0.43014705882352944,
"acc_norm_stderr": 0.030074971917302875
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5163398692810458,
"acc_stderr": 0.02021703065318646,
"acc_norm": 0.5163398692810458,
"acc_norm_stderr": 0.02021703065318646
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6122448979591837,
"acc_stderr": 0.031192230726795656,
"acc_norm": 0.6122448979591837,
"acc_norm_stderr": 0.031192230726795656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7213930348258707,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.7213930348258707,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.03301405946987249,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.03301405946987249
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062133,
"mc2": 0.40480080736740903,
"mc2_stderr": 0.014006044450922432
}
}
```
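If you prefer the raw JSON file linked above to the parquet configurations, it can be fetched directly; a sketch using `hf_hub_download` (the filename is taken from the link above, and the snippet only prints the top-level keys rather than assuming the file's internal layout):
```python
import json

from huggingface_hub import hf_hub_download

# download the raw results file referenced in the "Latest results" link
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_chargoddard__llama-2-26b-trenchcoat-stack",
    filename="results_2023-09-12T20-50-33.844037.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)
print(sorted(results.keys()))
```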
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
DanielPortwine/styles | 2023-09-12T20:54:02.000Z | [
"region:us"
] | DanielPortwine | null | null | null | 0 | 0 | Entry not found |
fmagot01/gtzan_all_preprocessed | 2023-09-12T20:57:19.000Z | [
"region:us"
] | fmagot01 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: label
dtype:
class_label:
names:
'0': blues
'1': classical
'2': country
'3': disco
'4': hiphop
'5': jazz
'6': metal
'7': pop
'8': reggae
'9': rock
- name: input_values
sequence: float32
- name: attention_mask
sequence: int32
splits:
- name: train
num_bytes: 3452159816
num_examples: 899
- name: test
num_bytes: 384000696
num_examples: 100
download_size: 1923103923
dataset_size: 3836160512
---
# Dataset Card for "gtzan_all_preprocessed"
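Based on the features declared in the YAML header above (a 10-genre `label`, an `input_values` float sequence and an `attention_mask`), a minimal loading sketch:
```python
from datasets import load_dataset

# the YAML header declares "train" and "test" splits
ds = load_dataset("fmagot01/gtzan_all_preprocessed", split="test")

sample = ds[0]
# map the integer class label back to its genre name (class 0 is "blues")
print(ds.features["label"].int2str(sample["label"]))
print(len(sample["input_values"]), len(sample["attention_mask"]))
```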
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_KnutJaegersberg__openllama_3b_EvolInstruct_lora_merged | 2023-09-12T21:29:50.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of KnutJaegersberg/openllama_3b_EvolInstruct_lora_merged
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KnutJaegersberg/openllama_3b_EvolInstruct_lora_merged](https://huggingface.co/KnutJaegersberg/openllama_3b_EvolInstruct_lora_merged)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__openllama_3b_EvolInstruct_lora_merged\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T21:28:35.383540](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__openllama_3b_EvolInstruct_lora_merged/blob/main/results_2023-09-12T21-28-35.383540.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.277018995463228,\n\
\ \"acc_stderr\": 0.03239747545557216,\n \"acc_norm\": 0.28094868444106624,\n\
\ \"acc_norm_stderr\": 0.032395131606659515,\n \"mc1\": 0.2178702570379437,\n\
\ \"mc1_stderr\": 0.014450846714123897,\n \"mc2\": 0.3477745324693538,\n\
\ \"mc2_stderr\": 0.013261749316970146\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.35580204778157,\n \"acc_stderr\": 0.013990571137918762,\n\
\ \"acc_norm\": 0.40273037542662116,\n \"acc_norm_stderr\": 0.014332236306790145\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5310695080661223,\n\
\ \"acc_stderr\": 0.004980138679161043,\n \"acc_norm\": 0.7159928301135232,\n\
\ \"acc_norm_stderr\": 0.004500186424443805\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n\
\ \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n\
\ \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.27631578947368424,\n \"acc_stderr\": 0.03639057569952924,\n\
\ \"acc_norm\": 0.27631578947368424,\n \"acc_norm_stderr\": 0.03639057569952924\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617749,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617749\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610337,\n\
\ \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610337\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748143,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748143\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.20689655172413793,\n \"acc_stderr\": 0.03375672449560554,\n\
\ \"acc_norm\": 0.20689655172413793,\n \"acc_norm_stderr\": 0.03375672449560554\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.291005291005291,\n \"acc_stderr\": 0.023393826500484875,\n \"\
acc_norm\": 0.291005291005291,\n \"acc_norm_stderr\": 0.023393826500484875\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.03852273364924318,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.03852273364924318\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23548387096774193,\n\
\ \"acc_stderr\": 0.024137632429337707,\n \"acc_norm\": 0.23548387096774193,\n\
\ \"acc_norm_stderr\": 0.024137632429337707\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.21674876847290642,\n \"acc_stderr\": 0.028990331252516235,\n\
\ \"acc_norm\": 0.21674876847290642,\n \"acc_norm_stderr\": 0.028990331252516235\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139404,\n\
\ \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139404\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.20202020202020202,\n \"acc_stderr\": 0.02860620428922988,\n \"\
acc_norm\": 0.20202020202020202,\n \"acc_norm_stderr\": 0.02860620428922988\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.23834196891191708,\n \"acc_stderr\": 0.03074890536390988,\n\
\ \"acc_norm\": 0.23834196891191708,\n \"acc_norm_stderr\": 0.03074890536390988\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2564102564102564,\n \"acc_stderr\": 0.02213908110397155,\n \
\ \"acc_norm\": 0.2564102564102564,\n \"acc_norm_stderr\": 0.02213908110397155\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23333333333333334,\n \"acc_stderr\": 0.025787874220959316,\n \
\ \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.025787874220959316\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2773109243697479,\n \"acc_stderr\": 0.02907937453948001,\n \
\ \"acc_norm\": 0.2773109243697479,\n \"acc_norm_stderr\": 0.02907937453948001\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24770642201834864,\n \"acc_stderr\": 0.018508143602547815,\n \"\
acc_norm\": 0.24770642201834864,\n \"acc_norm_stderr\": 0.018508143602547815\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.20833333333333334,\n \"acc_stderr\": 0.02769691071309394,\n \"\
acc_norm\": 0.20833333333333334,\n \"acc_norm_stderr\": 0.02769691071309394\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.23529411764705882,\n \"acc_stderr\": 0.029771775228145638,\n \"\
acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.029771775228145638\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.25316455696202533,\n \"acc_stderr\": 0.028304657943035293,\n \
\ \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.028304657943035293\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3991031390134529,\n\
\ \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.3991031390134529,\n\
\ \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2975206611570248,\n \"acc_stderr\": 0.04173349148083498,\n \"\
acc_norm\": 0.2975206611570248,\n \"acc_norm_stderr\": 0.04173349148083498\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2147239263803681,\n \"acc_stderr\": 0.03226219377286774,\n\
\ \"acc_norm\": 0.2147239263803681,\n \"acc_norm_stderr\": 0.03226219377286774\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.043270409325787296,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.043270409325787296\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.27184466019417475,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.27184466019417475,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2863247863247863,\n\
\ \"acc_stderr\": 0.029614323690456648,\n \"acc_norm\": 0.2863247863247863,\n\
\ \"acc_norm_stderr\": 0.029614323690456648\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165044\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.280970625798212,\n\
\ \"acc_stderr\": 0.016073127851221246,\n \"acc_norm\": 0.280970625798212,\n\
\ \"acc_norm_stderr\": 0.016073127851221246\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.023532925431044287,\n\
\ \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.023532925431044287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n\
\ \"acc_stderr\": 0.014288343803925312,\n \"acc_norm\": 0.24022346368715083,\n\
\ \"acc_norm_stderr\": 0.014288343803925312\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.025261691219729494,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.025261691219729494\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2765273311897106,\n\
\ \"acc_stderr\": 0.025403832978179625,\n \"acc_norm\": 0.2765273311897106,\n\
\ \"acc_norm_stderr\": 0.025403832978179625\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2993827160493827,\n \"acc_stderr\": 0.025483115601195462,\n\
\ \"acc_norm\": 0.2993827160493827,\n \"acc_norm_stderr\": 0.025483115601195462\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3120567375886525,\n \"acc_stderr\": 0.02764012054516993,\n \
\ \"acc_norm\": 0.3120567375886525,\n \"acc_norm_stderr\": 0.02764012054516993\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.01099615663514269,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.01099615663514269\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.21323529411764705,\n \"acc_stderr\": 0.024880971512294275,\n\
\ \"acc_norm\": 0.21323529411764705,\n \"acc_norm_stderr\": 0.024880971512294275\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2647058823529412,\n \"acc_stderr\": 0.017848089574913226,\n \
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.017848089574913226\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.32727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252088,\n \"acc_norm\": 0.32727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252088\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3346938775510204,\n \"acc_stderr\": 0.030209235226242307,\n\
\ \"acc_norm\": 0.3346938775510204,\n \"acc_norm_stderr\": 0.030209235226242307\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.27860696517412936,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.27860696517412936,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3373493975903614,\n\
\ \"acc_stderr\": 0.03680783690727581,\n \"acc_norm\": 0.3373493975903614,\n\
\ \"acc_norm_stderr\": 0.03680783690727581\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.03508771929824563,\n\
\ \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.03508771929824563\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2178702570379437,\n\
\ \"mc1_stderr\": 0.014450846714123897,\n \"mc2\": 0.3477745324693538,\n\
\ \"mc2_stderr\": 0.013261749316970146\n }\n}\n```"
repo_url: https://huggingface.co/KnutJaegersberg/openllama_3b_EvolInstruct_lora_merged
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|arc:challenge|25_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hellaswag|10_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T21-28-35.383540.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T21-28-35.383540.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T21-28-35.383540.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T21-28-35.383540.parquet'
- config_name: results
data_files:
- split: 2023_09_12T21_28_35.383540
path:
- results_2023-09-12T21-28-35.383540.parquet
- split: latest
path:
- results_2023-09-12T21-28-35.383540.parquet
---
# Dataset Card for Evaluation run of KnutJaegersberg/openllama_3b_EvolInstruct_lora_merged
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KnutJaegersberg/openllama_3b_EvolInstruct_lora_merged
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KnutJaegersberg/openllama_3b_EvolInstruct_lora_merged](https://huggingface.co/KnutJaegersberg/openllama_3b_EvolInstruct_lora_merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__openllama_3b_EvolInstruct_lora_merged",
"harness_truthfulqa_mc_0",
split="train")
```
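Since every configuration in this card also declares a "latest" split (see the split mapping in the header above), you can pin the load to the most recent run explicitly. A minimal sketch, using one of the per-task configurations listed in this card:
```python
from datasets import load_dataset

# "latest" always resolves to the most recent run for this configuration,
# so re-running this after a newer eval would pick up the newer parquet files.
details = load_dataset(
    "open-llm-leaderboard/details_KnutJaegersberg__openllama_3b_EvolInstruct_lora_merged",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
```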
## Latest results
These are the [latest results from run 2023-09-12T21:28:35.383540](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__openllama_3b_EvolInstruct_lora_merged/blob/main/results_2023-09-12T21-28-35.383540.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.277018995463228,
"acc_stderr": 0.03239747545557216,
"acc_norm": 0.28094868444106624,
"acc_norm_stderr": 0.032395131606659515,
"mc1": 0.2178702570379437,
"mc1_stderr": 0.014450846714123897,
"mc2": 0.3477745324693538,
"mc2_stderr": 0.013261749316970146
},
"harness|arc:challenge|25": {
"acc": 0.35580204778157,
"acc_stderr": 0.013990571137918762,
"acc_norm": 0.40273037542662116,
"acc_norm_stderr": 0.014332236306790145
},
"harness|hellaswag|10": {
"acc": 0.5310695080661223,
"acc_stderr": 0.004980138679161043,
"acc_norm": 0.7159928301135232,
"acc_norm_stderr": 0.004500186424443805
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.27631578947368424,
"acc_stderr": 0.03639057569952924,
"acc_norm": 0.27631578947368424,
"acc_norm_stderr": 0.03639057569952924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2679245283018868,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.2679245283018868,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617749,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617749
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.32340425531914896,
"acc_stderr": 0.030579442773610337,
"acc_norm": 0.32340425531914896,
"acc_norm_stderr": 0.030579442773610337
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748143,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748143
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.20689655172413793,
"acc_stderr": 0.03375672449560554,
"acc_norm": 0.20689655172413793,
"acc_norm_stderr": 0.03375672449560554
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.291005291005291,
"acc_stderr": 0.023393826500484875,
"acc_norm": 0.291005291005291,
"acc_norm_stderr": 0.023393826500484875
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924318,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924318
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23548387096774193,
"acc_stderr": 0.024137632429337707,
"acc_norm": 0.23548387096774193,
"acc_norm_stderr": 0.024137632429337707
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21674876847290642,
"acc_stderr": 0.028990331252516235,
"acc_norm": 0.21674876847290642,
"acc_norm_stderr": 0.028990331252516235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20202020202020202,
"acc_stderr": 0.02860620428922988,
"acc_norm": 0.20202020202020202,
"acc_norm_stderr": 0.02860620428922988
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23834196891191708,
"acc_stderr": 0.03074890536390988,
"acc_norm": 0.23834196891191708,
"acc_norm_stderr": 0.03074890536390988
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.02213908110397155,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.02213908110397155
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.025787874220959316,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.025787874220959316
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2773109243697479,
"acc_stderr": 0.02907937453948001,
"acc_norm": 0.2773109243697479,
"acc_norm_stderr": 0.02907937453948001
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24770642201834864,
"acc_stderr": 0.018508143602547815,
"acc_norm": 0.24770642201834864,
"acc_norm_stderr": 0.018508143602547815
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.02769691071309394,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.02769691071309394
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.029771775228145638,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.029771775228145638
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25316455696202533,
"acc_stderr": 0.028304657943035293,
"acc_norm": 0.25316455696202533,
"acc_norm_stderr": 0.028304657943035293
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3991031390134529,
"acc_stderr": 0.03286745312567961,
"acc_norm": 0.3991031390134529,
"acc_norm_stderr": 0.03286745312567961
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2975206611570248,
"acc_stderr": 0.04173349148083498,
"acc_norm": 0.2975206611570248,
"acc_norm_stderr": 0.04173349148083498
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2147239263803681,
"acc_stderr": 0.03226219377286774,
"acc_norm": 0.2147239263803681,
"acc_norm_stderr": 0.03226219377286774
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.043270409325787296,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.043270409325787296
},
"harness|hendrycksTest-management|5": {
"acc": 0.27184466019417475,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.27184466019417475,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2863247863247863,
"acc_stderr": 0.029614323690456648,
"acc_norm": 0.2863247863247863,
"acc_norm_stderr": 0.029614323690456648
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.280970625798212,
"acc_stderr": 0.016073127851221246,
"acc_norm": 0.280970625798212,
"acc_norm_stderr": 0.016073127851221246
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.25722543352601157,
"acc_stderr": 0.023532925431044287,
"acc_norm": 0.25722543352601157,
"acc_norm_stderr": 0.023532925431044287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24022346368715083,
"acc_stderr": 0.014288343803925312,
"acc_norm": 0.24022346368715083,
"acc_norm_stderr": 0.014288343803925312
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.025261691219729494,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.025261691219729494
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2765273311897106,
"acc_stderr": 0.025403832978179625,
"acc_norm": 0.2765273311897106,
"acc_norm_stderr": 0.025403832978179625
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2993827160493827,
"acc_stderr": 0.025483115601195462,
"acc_norm": 0.2993827160493827,
"acc_norm_stderr": 0.025483115601195462
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3120567375886525,
"acc_stderr": 0.02764012054516993,
"acc_norm": 0.3120567375886525,
"acc_norm_stderr": 0.02764012054516993
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.01099615663514269,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.01099615663514269
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.21323529411764705,
"acc_stderr": 0.024880971512294275,
"acc_norm": 0.21323529411764705,
"acc_norm_stderr": 0.024880971512294275
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.017848089574913226,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.017848089574913226
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.32727272727272727,
"acc_stderr": 0.04494290866252088,
"acc_norm": 0.32727272727272727,
"acc_norm_stderr": 0.04494290866252088
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3346938775510204,
"acc_stderr": 0.030209235226242307,
"acc_norm": 0.3346938775510204,
"acc_norm_stderr": 0.030209235226242307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.27860696517412936,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.27860696517412936,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3373493975903614,
"acc_stderr": 0.03680783690727581,
"acc_norm": 0.3373493975903614,
"acc_norm_stderr": 0.03680783690727581
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2178702570379437,
"mc1_stderr": 0.014450846714123897,
"mc2": 0.3477745324693538,
"mc2_stderr": 0.013261749316970146
}
}
```
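The aggregated numbers above are also stored as a standalone "results" configuration (declared in the header of this card). A minimal sketch of loading them, assuming the usual `datasets` split semantics:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics shown above,
# with the "latest" split pointing at the newest results parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_KnutJaegersberg__openllama_3b_EvolInstruct_lora_merged",
    "results",
    split="latest",
)
print(results)
```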
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
stonet2000/plugcharger-rl-demos | 2023-09-13T03:07:06.000Z | [
"region:us"
] | stonet2000 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_grimpep__MythoMax-L2-33b | 2023-09-12T21:47:50.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of grimpep/MythoMax-L2-33b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [grimpep/MythoMax-L2-33b](https://huggingface.co/grimpep/MythoMax-L2-33b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_grimpep__MythoMax-L2-33b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T21:46:34.528264](https://huggingface.co/datasets/open-llm-leaderboard/details_grimpep__MythoMax-L2-33b/blob/main/results_2023-09-12T21-46-34.528264.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.510197997446698,\n\
\ \"acc_stderr\": 0.034698207813013165,\n \"acc_norm\": 0.5143638265847267,\n\
\ \"acc_norm_stderr\": 0.03468158572078421,\n \"mc1\": 0.3402692778457772,\n\
\ \"mc1_stderr\": 0.016586304901762564,\n \"mc2\": 0.524808738389582,\n\
\ \"mc2_stderr\": 0.015873078551875083\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5392491467576792,\n \"acc_stderr\": 0.014566303676636583,\n\
\ \"acc_norm\": 0.5725255972696246,\n \"acc_norm_stderr\": 0.014456862944650649\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5786695877315275,\n\
\ \"acc_stderr\": 0.004927631806477559,\n \"acc_norm\": 0.7911770563632743,\n\
\ \"acc_norm_stderr\": 0.0040563690969549395\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04068942293855797,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04068942293855797\n },\n\
\ \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5433962264150943,\n \"acc_stderr\": 0.03065674869673943,\n\
\ \"acc_norm\": 0.5433962264150943,\n \"acc_norm_stderr\": 0.03065674869673943\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.04174752578923183,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.04174752578923183\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720685,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720685\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.44508670520231214,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.44508670520231214,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29894179894179895,\n \"acc_stderr\": 0.02357760479165581,\n \"\
acc_norm\": 0.29894179894179895,\n \"acc_norm_stderr\": 0.02357760479165581\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5709677419354838,\n\
\ \"acc_stderr\": 0.028156036538233193,\n \"acc_norm\": 0.5709677419354838,\n\
\ \"acc_norm_stderr\": 0.028156036538233193\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3842364532019704,\n \"acc_stderr\": 0.0342239856565755,\n\
\ \"acc_norm\": 0.3842364532019704,\n \"acc_norm_stderr\": 0.0342239856565755\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.036462049632538115,\n\
\ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.036462049632538115\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6313131313131313,\n \"acc_stderr\": 0.034373055019806184,\n \"\
acc_norm\": 0.6313131313131313,\n \"acc_norm_stderr\": 0.034373055019806184\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7512953367875648,\n \"acc_stderr\": 0.03119584087770029,\n\
\ \"acc_norm\": 0.7512953367875648,\n \"acc_norm_stderr\": 0.03119584087770029\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4641025641025641,\n \"acc_stderr\": 0.025285585990017848,\n\
\ \"acc_norm\": 0.4641025641025641,\n \"acc_norm_stderr\": 0.025285585990017848\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.03243718055137411,\n \
\ \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.03243718055137411\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6697247706422018,\n\
\ \"acc_stderr\": 0.020164466336342977,\n \"acc_norm\": 0.6697247706422018,\n\
\ \"acc_norm_stderr\": 0.020164466336342977\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.031415546294025445,\n\
\ \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.031415546294025445\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7254901960784313,\n \"acc_stderr\": 0.031321798030832904,\n \"\
acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.031321798030832904\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6962025316455697,\n \"acc_stderr\": 0.0299366963871386,\n \
\ \"acc_norm\": 0.6962025316455697,\n \"acc_norm_stderr\": 0.0299366963871386\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.03210062154134986,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.03210062154134986\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009225,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009225\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6776859504132231,\n \"acc_stderr\": 0.042664163633521685,\n \"\
acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.042664163633521685\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n\
\ \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6504854368932039,\n \"acc_stderr\": 0.04721188506097172,\n\
\ \"acc_norm\": 0.6504854368932039,\n \"acc_norm_stderr\": 0.04721188506097172\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.026246772946890488,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.026246772946890488\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6934865900383141,\n\
\ \"acc_stderr\": 0.016486952893041508,\n \"acc_norm\": 0.6934865900383141,\n\
\ \"acc_norm_stderr\": 0.016486952893041508\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5289017341040463,\n \"acc_stderr\": 0.026874085883518348,\n\
\ \"acc_norm\": 0.5289017341040463,\n \"acc_norm_stderr\": 0.026874085883518348\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2324022346368715,\n\
\ \"acc_stderr\": 0.014125968754673384,\n \"acc_norm\": 0.2324022346368715,\n\
\ \"acc_norm_stderr\": 0.014125968754673384\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.02849199358617156,\n\
\ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.02849199358617156\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6045016077170418,\n\
\ \"acc_stderr\": 0.02777091853142784,\n \"acc_norm\": 0.6045016077170418,\n\
\ \"acc_norm_stderr\": 0.02777091853142784\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.027339546640662737,\n\
\ \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.027339546640662737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40425531914893614,\n \"acc_stderr\": 0.029275532159704732,\n \
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.029275532159704732\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39113428943937417,\n\
\ \"acc_stderr\": 0.012463861839982064,\n \"acc_norm\": 0.39113428943937417,\n\
\ \"acc_norm_stderr\": 0.012463861839982064\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.030320243265004137,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.030320243265004137\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5212418300653595,\n \"acc_stderr\": 0.020209572388600244,\n \
\ \"acc_norm\": 0.5212418300653595,\n \"acc_norm_stderr\": 0.020209572388600244\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.031680911612338825,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.031680911612338825\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6467661691542289,\n\
\ \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.6467661691542289,\n\
\ \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079022,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079022\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.03424042924691583,\n\
\ \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.03424042924691583\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3402692778457772,\n\
\ \"mc1_stderr\": 0.016586304901762564,\n \"mc2\": 0.524808738389582,\n\
\ \"mc2_stderr\": 0.015873078551875083\n }\n}\n```"
repo_url: https://huggingface.co/grimpep/MythoMax-L2-33b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|arc:challenge|25_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hellaswag|10_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T21-46-34.528264.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T21-46-34.528264.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T21-46-34.528264.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T21-46-34.528264.parquet'
- config_name: results
data_files:
- split: 2023_09_12T21_46_34.528264
path:
- results_2023-09-12T21-46-34.528264.parquet
- split: latest
path:
- results_2023-09-12T21-46-34.528264.parquet
---
# Dataset Card for Evaluation run of grimpep/MythoMax-L2-33b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/grimpep/MythoMax-L2-33b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [grimpep/MythoMax-L2-33b](https://huggingface.co/grimpep/MythoMax-L2-33b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_grimpep__MythoMax-L2-33b",
"harness_truthfulqa_mc_0",
split="train")
```
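Each configuration also defines a `latest` split pointing at the most recent run (see the `configs` section in the YAML header above), and the aggregated numbers live in the `results` configuration. A minimal variant of the snippet above, assuming the same `datasets` API:
```python
from datasets import load_dataset

# Per-sample details for the most recent run of one task.
details = load_dataset("open-llm-leaderboard/details_grimpep__MythoMax-L2-33b",
	"harness_truthfulqa_mc_0",
	split="latest")

# Aggregated metrics for the most recent run.
results = load_dataset("open-llm-leaderboard/details_grimpep__MythoMax-L2-33b",
	"results",
	split="latest")
```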
## Latest results
These are the [latest results from run 2023-09-12T21:46:34.528264](https://huggingface.co/datasets/open-llm-leaderboard/details_grimpep__MythoMax-L2-33b/blob/main/results_2023-09-12T21-46-34.528264.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.510197997446698,
"acc_stderr": 0.034698207813013165,
"acc_norm": 0.5143638265847267,
"acc_norm_stderr": 0.03468158572078421,
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762564,
"mc2": 0.524808738389582,
"mc2_stderr": 0.015873078551875083
},
"harness|arc:challenge|25": {
"acc": 0.5392491467576792,
"acc_stderr": 0.014566303676636583,
"acc_norm": 0.5725255972696246,
"acc_norm_stderr": 0.014456862944650649
},
"harness|hellaswag|10": {
"acc": 0.5786695877315275,
"acc_stderr": 0.004927631806477559,
"acc_norm": 0.7911770563632743,
"acc_norm_stderr": 0.0040563690969549395
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5,
"acc_stderr": 0.04068942293855797,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04068942293855797
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5433962264150943,
"acc_stderr": 0.03065674869673943,
"acc_norm": 0.5433962264150943,
"acc_norm_stderr": 0.03065674869673943
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.04174752578923183,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.04174752578923183
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.44508670520231214,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.44508670520231214,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29894179894179895,
"acc_stderr": 0.02357760479165581,
"acc_norm": 0.29894179894179895,
"acc_norm_stderr": 0.02357760479165581
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.041905964388711366,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.041905964388711366
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5709677419354838,
"acc_stderr": 0.028156036538233193,
"acc_norm": 0.5709677419354838,
"acc_norm_stderr": 0.028156036538233193
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3842364532019704,
"acc_stderr": 0.0342239856565755,
"acc_norm": 0.3842364532019704,
"acc_norm_stderr": 0.0342239856565755
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.036462049632538115,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.036462049632538115
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6313131313131313,
"acc_stderr": 0.034373055019806184,
"acc_norm": 0.6313131313131313,
"acc_norm_stderr": 0.034373055019806184
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7512953367875648,
"acc_stderr": 0.03119584087770029,
"acc_norm": 0.7512953367875648,
"acc_norm_stderr": 0.03119584087770029
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4641025641025641,
"acc_stderr": 0.025285585990017848,
"acc_norm": 0.4641025641025641,
"acc_norm_stderr": 0.025285585990017848
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5252100840336135,
"acc_stderr": 0.03243718055137411,
"acc_norm": 0.5252100840336135,
"acc_norm_stderr": 0.03243718055137411
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6697247706422018,
"acc_stderr": 0.020164466336342977,
"acc_norm": 0.6697247706422018,
"acc_norm_stderr": 0.020164466336342977
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.031415546294025445,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.031415546294025445
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.031321798030832904,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.031321798030832904
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6962025316455697,
"acc_stderr": 0.0299366963871386,
"acc_norm": 0.6962025316455697,
"acc_norm_stderr": 0.0299366963871386
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.03210062154134986,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.03210062154134986
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009225,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009225
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.042664163633521685,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.042664163633521685
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.6504854368932039,
"acc_stderr": 0.04721188506097172,
"acc_norm": 0.6504854368932039,
"acc_norm_stderr": 0.04721188506097172
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.026246772946890488,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.026246772946890488
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6934865900383141,
"acc_stderr": 0.016486952893041508,
"acc_norm": 0.6934865900383141,
"acc_norm_stderr": 0.016486952893041508
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5289017341040463,
"acc_stderr": 0.026874085883518348,
"acc_norm": 0.5289017341040463,
"acc_norm_stderr": 0.026874085883518348
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2324022346368715,
"acc_stderr": 0.014125968754673384,
"acc_norm": 0.2324022346368715,
"acc_norm_stderr": 0.014125968754673384
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.02849199358617156,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.02849199358617156
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6045016077170418,
"acc_stderr": 0.02777091853142784,
"acc_norm": 0.6045016077170418,
"acc_norm_stderr": 0.02777091853142784
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.027339546640662737,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.027339546640662737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.029275532159704732,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.029275532159704732
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39113428943937417,
"acc_stderr": 0.012463861839982064,
"acc_norm": 0.39113428943937417,
"acc_norm_stderr": 0.012463861839982064
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.030320243265004137,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.030320243265004137
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5212418300653595,
"acc_stderr": 0.020209572388600244,
"acc_norm": 0.5212418300653595,
"acc_norm_stderr": 0.020209572388600244
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.031680911612338825,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.031680911612338825
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6467661691542289,
"acc_stderr": 0.03379790611796777,
"acc_norm": 0.6467661691542289,
"acc_norm_stderr": 0.03379790611796777
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079022,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079022
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7251461988304093,
"acc_stderr": 0.03424042924691583,
"acc_norm": 0.7251461988304093,
"acc_norm_stderr": 0.03424042924691583
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762564,
"mc2": 0.524808738389582,
"mc2_stderr": 0.015873078551875083
}
}
```
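As a quick way to work with these numbers, here is a minimal sketch (assuming the JSON above has been saved locally as `results.json`, a hypothetical filename) that averages `acc_norm` over the `hendrycksTest` (MMLU) subtasks:
```python
import json

# Hypothetical local copy of the results JSON shown above.
with open("results.json") as f:
    results = json.load(f)

# Mean acc_norm over the hendrycksTest (MMLU) subtasks only;
# "all", arc, hellaswag and truthfulqa entries are filtered out.
mmlu = [v["acc_norm"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest")]
print(f"{len(mmlu)} MMLU subtasks, mean acc_norm = {sum(mmlu) / len(mmlu):.4f}")
```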
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
nikchar/paper_test_assym_roberta_results | 2023-09-12T21:49:26.000Z | [
"region:us"
] | nikchar | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: claim
dtype: string
- name: evidence_wiki_url
dtype: string
- name: text
dtype: string
- name: retrieved_evidence_title
sequence: string
- name: retrieved_evidence_text
sequence: string
- name: labels
dtype: int64
- name: Retrieval_Success
dtype: bool
- name: Predicted_Labels
dtype: int64
- name: Predicted_Labels_Each_doc
sequence: int64
splits:
- name: train
num_bytes: 73601741
num_examples: 11073
download_size: 34426502
dataset_size: 73601741
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "paper_test_assym_roberta_results"
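A minimal sketch for loading it, assuming the standard `datasets` API and the `train` split declared in the YAML header above:
```python
from datasets import load_dataset

# The default config maps the "train" split to data/train-* (see the YAML header).
ds = load_dataset("nikchar/paper_test_assym_roberta_results", split="train")
print(ds)  # columns: claim, evidence_wiki_url, text, retrieved evidence, labels, ...
```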
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/aether_foundation_employee_pokemon | 2023-09-17T17:34:49.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of aether_foundation_employee (Pokémon)
This is the dataset of aether_foundation_employee (Pokémon), containing 148 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 148 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 409 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 148 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 148 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 148 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 148 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 148 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 409 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 409 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 409 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
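The archives in the table are regular files in this dataset repository; a minimal sketch for fetching one of them with `huggingface_hub`, assuming the zips sit at the repo root as the relative links above suggest:
```python
from huggingface_hub import hf_hub_download

# Download the 512x512 aligned archive listed in the table above.
path = hf_hub_download(
    repo_id="CyberHarem/aether_foundation_employee_pokemon",
    filename="dataset-512x512.zip",
    repo_type="dataset",
)
print(path)  # local cache path of the downloaded zip
```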
|
CyberHarem/sakuma_mayu_idolmastercinderellagirls | 2023-09-17T17:34:51.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of sakuma_mayu (THE iDOLM@STER: Cinderella Girls)
This is the dataset of sakuma_mayu (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 494 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 494 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 494 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 494 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/sumomo_pokemon | 2023-09-17T17:34:53.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of sumomo (Pokémon)
This is the dataset of sumomo (Pokémon), containing 73 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 73 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 186 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 73 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 73 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 73 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 73 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 73 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 186 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 186 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 186 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_bongchoi__test-llama-2-7b | 2023-09-12T23:03:48.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of bongchoi/test-llama-2-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bongchoi/test-llama-2-7b](https://huggingface.co/bongchoi/test-llama-2-7b) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bongchoi__test-llama-2-7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T23:02:34.518107](https://huggingface.co/datasets/open-llm-leaderboard/details_bongchoi__test-llama-2-7b/blob/main/results_2023-09-12T23-02-34.518107.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.471008753299703,\n\
\ \"acc_stderr\": 0.03528088196519964,\n \"acc_norm\": 0.4749886536723232,\n\
\ \"acc_norm_stderr\": 0.035266604173246285,\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.01512742709652068,\n \"mc2\": 0.3875084099562216,\n\
\ \"mc2_stderr\": 0.013510147651392562\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4931740614334471,\n \"acc_stderr\": 0.014610029151379813,\n\
\ \"acc_norm\": 0.5307167235494881,\n \"acc_norm_stderr\": 0.014583792546304037\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5884285998805019,\n\
\ \"acc_stderr\": 0.0049111251010646425,\n \"acc_norm\": 0.785700059749054,\n\
\ \"acc_norm_stderr\": 0.004094971980892084\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.40789473684210525,\n \"acc_stderr\": 0.03999309712777471,\n\
\ \"acc_norm\": 0.40789473684210525,\n \"acc_norm_stderr\": 0.03999309712777471\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4641509433962264,\n \"acc_stderr\": 0.030693675018458003,\n\
\ \"acc_norm\": 0.4641509433962264,\n \"acc_norm_stderr\": 0.030693675018458003\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4652777777777778,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.4652777777777778,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.42196531791907516,\n\
\ \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.42196531791907516,\n\
\ \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577656,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577656\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.028444006199428714,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.028444006199428714\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3645320197044335,\n \"acc_stderr\": 0.033864057460620905,\n\
\ \"acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.033864057460620905\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.03756335775187899,\n\
\ \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03756335775187899\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4898989898989899,\n \"acc_stderr\": 0.03561625488673745,\n \"\
acc_norm\": 0.4898989898989899,\n \"acc_norm_stderr\": 0.03561625488673745\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6787564766839378,\n \"acc_stderr\": 0.033699508685490674,\n\
\ \"acc_norm\": 0.6787564766839378,\n \"acc_norm_stderr\": 0.033699508685490674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.45897435897435895,\n \"acc_stderr\": 0.025265525491284295,\n\
\ \"acc_norm\": 0.45897435897435895,\n \"acc_norm_stderr\": 0.025265525491284295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228416,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.0322529423239964,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.0322529423239964\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.037804458505267334,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.037804458505267334\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6311926605504588,\n \"acc_stderr\": 0.020686227560729555,\n \"\
acc_norm\": 0.6311926605504588,\n \"acc_norm_stderr\": 0.020686227560729555\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.27314814814814814,\n \"acc_stderr\": 0.03038805130167812,\n \"\
acc_norm\": 0.27314814814814814,\n \"acc_norm_stderr\": 0.03038805130167812\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5343137254901961,\n \"acc_stderr\": 0.03501038327635897,\n \"\
acc_norm\": 0.5343137254901961,\n \"acc_norm_stderr\": 0.03501038327635897\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6286919831223629,\n \"acc_stderr\": 0.03145068600744859,\n \
\ \"acc_norm\": 0.6286919831223629,\n \"acc_norm_stderr\": 0.03145068600744859\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5650224215246636,\n\
\ \"acc_stderr\": 0.033272833702713445,\n \"acc_norm\": 0.5650224215246636,\n\
\ \"acc_norm_stderr\": 0.033272833702713445\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760628,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5153374233128835,\n \"acc_stderr\": 0.03926522378708843,\n\
\ \"acc_norm\": 0.5153374233128835,\n \"acc_norm_stderr\": 0.03926522378708843\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5533980582524272,\n \"acc_stderr\": 0.04922424153458933,\n\
\ \"acc_norm\": 0.5533980582524272,\n \"acc_norm_stderr\": 0.04922424153458933\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6923076923076923,\n\
\ \"acc_stderr\": 0.030236389942173085,\n \"acc_norm\": 0.6923076923076923,\n\
\ \"acc_norm_stderr\": 0.030236389942173085\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6398467432950191,\n\
\ \"acc_stderr\": 0.017166362471369306,\n \"acc_norm\": 0.6398467432950191,\n\
\ \"acc_norm_stderr\": 0.017166362471369306\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.49421965317919075,\n \"acc_stderr\": 0.026917296179149116,\n\
\ \"acc_norm\": 0.49421965317919075,\n \"acc_norm_stderr\": 0.026917296179149116\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.49673202614379086,\n \"acc_stderr\": 0.02862930519400354,\n\
\ \"acc_norm\": 0.49673202614379086,\n \"acc_norm_stderr\": 0.02862930519400354\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6012861736334405,\n\
\ \"acc_stderr\": 0.0278093225857745,\n \"acc_norm\": 0.6012861736334405,\n\
\ \"acc_norm_stderr\": 0.0278093225857745\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4876543209876543,\n \"acc_stderr\": 0.027812262269327228,\n\
\ \"acc_norm\": 0.4876543209876543,\n \"acc_norm_stderr\": 0.027812262269327228\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3617021276595745,\n \"acc_stderr\": 0.028663820147199492,\n \
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.028663820147199492\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3624511082138201,\n\
\ \"acc_stderr\": 0.01227751253325248,\n \"acc_norm\": 0.3624511082138201,\n\
\ \"acc_norm_stderr\": 0.01227751253325248\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.03033257809455504,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.03033257809455504\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4411764705882353,\n \"acc_stderr\": 0.020087362076702857,\n \
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.020087362076702857\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4857142857142857,\n \"acc_stderr\": 0.03199615232806287,\n\
\ \"acc_norm\": 0.4857142857142857,\n \"acc_norm_stderr\": 0.03199615232806287\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6318407960199005,\n\
\ \"acc_stderr\": 0.03410410565495301,\n \"acc_norm\": 0.6318407960199005,\n\
\ \"acc_norm_stderr\": 0.03410410565495301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.03508771929824563,\n\
\ \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.03508771929824563\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.01512742709652068,\n \"mc2\": 0.3875084099562216,\n\
\ \"mc2_stderr\": 0.013510147651392562\n }\n}\n```"
repo_url: https://huggingface.co/bongchoi/test-llama-2-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|arc:challenge|25_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hellaswag|10_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T23-02-34.518107.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T23-02-34.518107.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T23-02-34.518107.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T23-02-34.518107.parquet'
- config_name: results
data_files:
- split: 2023_09_12T23_02_34.518107
path:
- results_2023-09-12T23-02-34.518107.parquet
- split: latest
path:
- results_2023-09-12T23-02-34.518107.parquet
---
# Dataset Card for Evaluation run of bongchoi/test-llama-2-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bongchoi/test-llama-2-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [bongchoi/test-llama-2-7b](https://huggingface.co/bongchoi/test-llama-2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bongchoi__test-llama-2-7b",
"harness_truthfulqa_mc_0",
split="train")
```
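Because every run is also registered under a `latest` alias (see the `split: latest` entries in the YAML configuration above), you can pin the same query to the most recent files explicitly; a minimal sketch:
```python
from datasets import load_dataset

# Same repository and configuration as above, but requesting the "latest"
# split alias instead of "train"; both point at the most recent run here.
data = load_dataset("open-llm-leaderboard/details_bongchoi__test-llama-2-7b",
	"harness_truthfulqa_mc_0",
	split="latest")
```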
## Latest results
These are the [latest results from run 2023-09-12T23:02:34.518107](https://huggingface.co/datasets/open-llm-leaderboard/details_bongchoi__test-llama-2-7b/blob/main/results_2023-09-12T23-02-34.518107.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.471008753299703,
"acc_stderr": 0.03528088196519964,
"acc_norm": 0.4749886536723232,
"acc_norm_stderr": 0.035266604173246285,
"mc1": 0.2484700122399021,
"mc1_stderr": 0.01512742709652068,
"mc2": 0.3875084099562216,
"mc2_stderr": 0.013510147651392562
},
"harness|arc:challenge|25": {
"acc": 0.4931740614334471,
"acc_stderr": 0.014610029151379813,
"acc_norm": 0.5307167235494881,
"acc_norm_stderr": 0.014583792546304037
},
"harness|hellaswag|10": {
"acc": 0.5884285998805019,
"acc_stderr": 0.0049111251010646425,
"acc_norm": 0.785700059749054,
"acc_norm_stderr": 0.004094971980892084
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40789473684210525,
"acc_stderr": 0.03999309712777471,
"acc_norm": 0.40789473684210525,
"acc_norm_stderr": 0.03999309712777471
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4641509433962264,
"acc_stderr": 0.030693675018458003,
"acc_norm": 0.4641509433962264,
"acc_norm_stderr": 0.030693675018458003
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4652777777777778,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.4652777777777778,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.42196531791907516,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.42196531791907516,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577656,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.02278967314577656
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5,
"acc_stderr": 0.028444006199428714,
"acc_norm": 0.5,
"acc_norm_stderr": 0.028444006199428714
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.033864057460620905,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.033864057460620905
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03756335775187899,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03756335775187899
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4898989898989899,
"acc_stderr": 0.03561625488673745,
"acc_norm": 0.4898989898989899,
"acc_norm_stderr": 0.03561625488673745
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6787564766839378,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.6787564766839378,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.45897435897435895,
"acc_stderr": 0.025265525491284295,
"acc_norm": 0.45897435897435895,
"acc_norm_stderr": 0.025265525491284295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228416,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.037804458505267334,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.037804458505267334
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6311926605504588,
"acc_stderr": 0.020686227560729555,
"acc_norm": 0.6311926605504588,
"acc_norm_stderr": 0.020686227560729555
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.27314814814814814,
"acc_stderr": 0.03038805130167812,
"acc_norm": 0.27314814814814814,
"acc_norm_stderr": 0.03038805130167812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5343137254901961,
"acc_stderr": 0.03501038327635897,
"acc_norm": 0.5343137254901961,
"acc_norm_stderr": 0.03501038327635897
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6286919831223629,
"acc_stderr": 0.03145068600744859,
"acc_norm": 0.6286919831223629,
"acc_norm_stderr": 0.03145068600744859
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5650224215246636,
"acc_stderr": 0.033272833702713445,
"acc_norm": 0.5650224215246636,
"acc_norm_stderr": 0.033272833702713445
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760628,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5153374233128835,
"acc_stderr": 0.03926522378708843,
"acc_norm": 0.5153374233128835,
"acc_norm_stderr": 0.03926522378708843
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.5533980582524272,
"acc_stderr": 0.04922424153458933,
"acc_norm": 0.5533980582524272,
"acc_norm_stderr": 0.04922424153458933
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.030236389942173085,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.030236389942173085
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6398467432950191,
"acc_stderr": 0.017166362471369306,
"acc_norm": 0.6398467432950191,
"acc_norm_stderr": 0.017166362471369306
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.49421965317919075,
"acc_stderr": 0.026917296179149116,
"acc_norm": 0.49421965317919075,
"acc_norm_stderr": 0.026917296179149116
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.49673202614379086,
"acc_stderr": 0.02862930519400354,
"acc_norm": 0.49673202614379086,
"acc_norm_stderr": 0.02862930519400354
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6012861736334405,
"acc_stderr": 0.0278093225857745,
"acc_norm": 0.6012861736334405,
"acc_norm_stderr": 0.0278093225857745
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4876543209876543,
"acc_stderr": 0.027812262269327228,
"acc_norm": 0.4876543209876543,
"acc_norm_stderr": 0.027812262269327228
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.028663820147199492,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.028663820147199492
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3624511082138201,
"acc_stderr": 0.01227751253325248,
"acc_norm": 0.3624511082138201,
"acc_norm_stderr": 0.01227751253325248
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.03033257809455504,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.03033257809455504
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.020087362076702857,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.020087362076702857
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4857142857142857,
"acc_stderr": 0.03199615232806287,
"acc_norm": 0.4857142857142857,
"acc_norm_stderr": 0.03199615232806287
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6318407960199005,
"acc_stderr": 0.03410410565495301,
"acc_norm": 0.6318407960199005,
"acc_norm_stderr": 0.03410410565495301
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2484700122399021,
"mc1_stderr": 0.01512742709652068,
"mc2": 0.3875084099562216,
"mc2_stderr": 0.013510147651392562
}
}
```
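The aggregated numbers above are also stored under the standalone `results` configuration declared in the YAML front matter; a minimal sketch of loading them follows, with the caveat that the exact parquet schema is not documented in this card:
```python
from datasets import load_dataset

# Load the aggregated results parquet (the "results" config in the YAML above).
results = load_dataset("open-llm-leaderboard/details_bongchoi__test-llama-2-7b",
	"results",
	split="latest")

# The parquet schema is not documented in the card, so inspect the first
# record to see how the metrics shown above are laid out.
print(results[0])
```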
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
shelvin94/Darija | 2023-09-12T23:17:15.000Z | [
"language:ar",
"region:us"
] | shelvin94 | null | null | null | 0 | 0 | ---
language:
- ar
--- |
sadsadsadddd/asasasa | 2023-09-12T23:41:35.000Z | [
"license:openrail",
"region:us"
] | sadsadsadddd | null | null | null | 0 | 0 | ---
license: openrail
---
|
CyberHarem/suiren_s_mother_pokemon | 2023-09-17T17:34:55.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of suiren_s_mother (Pokémon)
This is the dataset of suiren_s_mother (Pokémon), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 528 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 528 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 528 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 528 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
AmelieSchreiber/binding_sites_random_split_by_family_550K | 2023-09-13T19:39:56.000Z | [
"size_categories:100K<n<1M",
"language:en",
"license:mit",
"biology",
"protein sequences",
"binding sites",
"active sites",
"region:us"
] | AmelieSchreiber | null | null | null | 1 | 0 | ---
license: mit
language:
- en
tags:
- biology
- protein sequences
- binding sites
- active sites
size_categories:
- 100K<n<1M
---
This dataset is obtained from a [UniProt search](https://www.uniprot.org/uniprotkb?facets=proteins_with%3A9%2Cannotation_score%3A4&fields=accession%2Cprotein_families%2Cft_binding%2Cft_act_site%2Csequence%2Ccc_similarity&query=%28ft_binding%3A*%29+AND+%28family%3A*%29&view=table)
for protein sequences with family and binding site annotations. The dataset includes unreviewed (TrEMBL) protein sequences as well as
reviewed sequences. We refined the dataset by only including sequences with an annotation score of 4. We sorted and split by family, where
random families were selected for the test dataset until approximately 20% of the protein sequences were separated out for test data.
We excluded any sequences with `<`, `>`, or `?` in the binding site annotations. We furthermore included any active sites that were not
listed as binding sites in the labels (seen in the merged "Binding-Active Sites" column). We split any sequence longer than 1000 residues
into non-overlapping sections of 1000 amino acids or fewer after the train/test split. This results in subsequences of the original protein
sequence that may be too short for consideration; filtering the dataset to exclude such subsequences, or segmenting the longer sequences
in a more intelligent way, may improve performance. Pickle files containing only the train/test sequences and their binary labels are also available
and can be downloaded for training or for validating the train/test metrics.
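As a rough illustration of the chunking rule above, here is a minimal sketch, assuming a plain amino-acid string with per-residue binary labels; the names `sequence` and `labels` are illustrative, not fields of the released pickle files.
```python
# A minimal sketch of the chunking rule described above; `sequence` and
# `labels` are illustrative names, not fields of the released pickle files.
def split_into_chunks(sequence, labels, max_len=1000):
    """Split a sequence and its per-residue binary labels into
    non-overlapping windows of at most `max_len` residues."""
    assert len(sequence) == len(labels)
    return [
        (sequence[i:i + max_len], labels[i:i + max_len])
        for i in range(0, len(sequence), max_len)
    ]

# Example: a 2345-residue protein yields windows of 1000, 1000, and 345 residues.
chunks = split_into_chunks("M" * 2345, [0] * 2345)
print([len(seq) for seq, _ in chunks])  # [1000, 1000, 345]
```
|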
ForestCabo/your-dataset-name | 2023-09-13T00:10:24.000Z | [
"region:us"
] | ForestCabo | null | null | null | 0 | 0 | Entry not found |
DylanJHJ/temp | 2023-10-03T01:22:52.000Z | [
"license:apache-2.0",
"region:us"
] | DylanJHJ | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
CyberHarem/totoki_airi_idolmastercinderellagirls | 2023-09-17T17:34:57.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of totoki_airi (THE iDOLM@STER: Cinderella Girls)
This is the dataset of totoki_airi (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 533 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 533 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 533 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 533 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/team_skull_underling_pokemon | 2023-09-17T17:34:59.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of team_skull_underling (Pokémon)
This is the dataset of team_skull_underling (Pokémon), containing 62 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 62 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 146 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 62 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 62 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 62 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 62 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 62 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 146 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 146 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 146 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_TheBloke__manticore-13b-chat-pyg-GPTQ | 2023-09-13T00:36:21.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/manticore-13b-chat-pyg-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/manticore-13b-chat-pyg-GPTQ](https://huggingface.co/TheBloke/manticore-13b-chat-pyg-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__manticore-13b-chat-pyg-GPTQ\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T00:35:05.075823](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__manticore-13b-chat-pyg-GPTQ/blob/main/results_2023-09-13T00-35-05.075823.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.47910157210911797,\n\
\ \"acc_stderr\": 0.035136312942949756,\n \"acc_norm\": 0.4829754447260613,\n\
\ \"acc_norm_stderr\": 0.035118284304147665,\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.016419874731135032,\n \"mc2\": 0.4776861669816965,\n\
\ \"mc2_stderr\": 0.014996477492223563\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5511945392491467,\n \"acc_stderr\": 0.014534599585097662,\n\
\ \"acc_norm\": 0.5784982935153583,\n \"acc_norm_stderr\": 0.014430197069326025\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6094403505277833,\n\
\ \"acc_stderr\": 0.004868787333436583,\n \"acc_norm\": 0.8106950806612229,\n\
\ \"acc_norm_stderr\": 0.0039095001598848985\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.506578947368421,\n \"acc_stderr\": 0.040685900502249704,\n\
\ \"acc_norm\": 0.506578947368421,\n \"acc_norm_stderr\": 0.040685900502249704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5169811320754717,\n \"acc_stderr\": 0.030755120364119905,\n\
\ \"acc_norm\": 0.5169811320754717,\n \"acc_norm_stderr\": 0.030755120364119905\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4791666666666667,\n\
\ \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 0.4791666666666667,\n\
\ \"acc_norm_stderr\": 0.041775789507399935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.42196531791907516,\n\
\ \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.42196531791907516,\n\
\ \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929774,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929774\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.03177821250236922,\n\
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.03177821250236922\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2830687830687831,\n \"acc_stderr\": 0.023201392938194978,\n \"\
acc_norm\": 0.2830687830687831,\n \"acc_norm_stderr\": 0.023201392938194978\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\
\ \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n\
\ \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.49032258064516127,\n \"acc_stderr\": 0.028438677998909565,\n \"\
acc_norm\": 0.49032258064516127,\n \"acc_norm_stderr\": 0.028438677998909565\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n \"\
acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03825460278380026,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03825460278380026\n },\n\
\ \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5404040404040404,\n\
\ \"acc_stderr\": 0.03550702465131343,\n \"acc_norm\": 0.5404040404040404,\n\
\ \"acc_norm_stderr\": 0.03550702465131343\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\"\
: {\n \"acc\": 0.6424870466321243,\n \"acc_stderr\": 0.034588160421810114,\n\
\ \"acc_norm\": 0.6424870466321243,\n \"acc_norm_stderr\": 0.034588160421810114\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.43846153846153846,\n \"acc_stderr\": 0.02515826601686857,\n\
\ \"acc_norm\": 0.43846153846153846,\n \"acc_norm_stderr\": 0.02515826601686857\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.453781512605042,\n \"acc_stderr\": 0.032339434681820885,\n \
\ \"acc_norm\": 0.453781512605042,\n \"acc_norm_stderr\": 0.032339434681820885\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6018348623853211,\n \"acc_stderr\": 0.02098798942265427,\n \"\
acc_norm\": 0.6018348623853211,\n \"acc_norm_stderr\": 0.02098798942265427\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.27314814814814814,\n \"acc_stderr\": 0.030388051301678116,\n \"\
acc_norm\": 0.27314814814814814,\n \"acc_norm_stderr\": 0.030388051301678116\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6421568627450981,\n \"acc_stderr\": 0.03364487286088299,\n \"\
acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.03364487286088299\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7088607594936709,\n \"acc_stderr\": 0.02957160106575337,\n \
\ \"acc_norm\": 0.7088607594936709,\n \"acc_norm_stderr\": 0.02957160106575337\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5426008968609866,\n\
\ \"acc_stderr\": 0.03343577705583065,\n \"acc_norm\": 0.5426008968609866,\n\
\ \"acc_norm_stderr\": 0.03343577705583065\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5343511450381679,\n \"acc_stderr\": 0.043749285605997376,\n\
\ \"acc_norm\": 0.5343511450381679,\n \"acc_norm_stderr\": 0.043749285605997376\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.04369236326573981,\n \"\
acc_norm\": 0.6446280991735537,\n \"acc_norm_stderr\": 0.04369236326573981\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760627,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.50920245398773,\n \"acc_stderr\": 0.03927705600787443,\n\
\ \"acc_norm\": 0.50920245398773,\n \"acc_norm_stderr\": 0.03927705600787443\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6116504854368932,\n \"acc_stderr\": 0.0482572933735639,\n\
\ \"acc_norm\": 0.6116504854368932,\n \"acc_norm_stderr\": 0.0482572933735639\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.717948717948718,\n\
\ \"acc_stderr\": 0.029480360549541187,\n \"acc_norm\": 0.717948717948718,\n\
\ \"acc_norm_stderr\": 0.029480360549541187\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6590038314176245,\n\
\ \"acc_stderr\": 0.016951781383223313,\n \"acc_norm\": 0.6590038314176245,\n\
\ \"acc_norm_stderr\": 0.016951781383223313\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4797687861271676,\n \"acc_stderr\": 0.026897049996382868,\n\
\ \"acc_norm\": 0.4797687861271676,\n \"acc_norm_stderr\": 0.026897049996382868\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23575418994413408,\n\
\ \"acc_stderr\": 0.014196375686290804,\n \"acc_norm\": 0.23575418994413408,\n\
\ \"acc_norm_stderr\": 0.014196375686290804\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5032679738562091,\n \"acc_stderr\": 0.028629305194003543,\n\
\ \"acc_norm\": 0.5032679738562091,\n \"acc_norm_stderr\": 0.028629305194003543\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5434083601286174,\n\
\ \"acc_stderr\": 0.028290869054197604,\n \"acc_norm\": 0.5434083601286174,\n\
\ \"acc_norm_stderr\": 0.028290869054197604\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.027777777777777804,\n\
\ \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.027777777777777804\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4078014184397163,\n \"acc_stderr\": 0.029316011776343555,\n \
\ \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.029316011776343555\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3970013037809648,\n\
\ \"acc_stderr\": 0.012496346982909556,\n \"acc_norm\": 0.3970013037809648,\n\
\ \"acc_norm_stderr\": 0.012496346982909556\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.030320243265004137,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.030320243265004137\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.48856209150326796,\n \"acc_stderr\": 0.020222541515610863,\n \
\ \"acc_norm\": 0.48856209150326796,\n \"acc_norm_stderr\": 0.020222541515610863\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.509090909090909,\n\
\ \"acc_stderr\": 0.04788339768702861,\n \"acc_norm\": 0.509090909090909,\n\
\ \"acc_norm_stderr\": 0.04788339768702861\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5183673469387755,\n \"acc_stderr\": 0.03198761546763127,\n\
\ \"acc_norm\": 0.5183673469387755,\n \"acc_norm_stderr\": 0.03198761546763127\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6467661691542289,\n\
\ \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.6467661691542289,\n\
\ \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6900584795321637,\n \"acc_stderr\": 0.035469769593931624,\n\
\ \"acc_norm\": 0.6900584795321637,\n \"acc_norm_stderr\": 0.035469769593931624\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.016419874731135032,\n \"mc2\": 0.4776861669816965,\n\
\ \"mc2_stderr\": 0.014996477492223563\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/manticore-13b-chat-pyg-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|arc:challenge|25_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hellaswag|10_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T00-35-05.075823.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T00-35-05.075823.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T00-35-05.075823.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T00-35-05.075823.parquet'
- config_name: results
data_files:
- split: 2023_09_13T00_35_05.075823
path:
- results_2023-09-13T00-35-05.075823.parquet
- split: latest
path:
- results_2023-09-13T00-35-05.075823.parquet
---
# Dataset Card for Evaluation run of TheBloke/manticore-13b-chat-pyg-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/manticore-13b-chat-pyg-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/manticore-13b-chat-pyg-GPTQ](https://huggingface.co/TheBloke/manticore-13b-chat-pyg-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__manticore-13b-chat-pyg-GPTQ",
"harness_truthfulqa_mc_0",
split="train")
```
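The same call also works for the aggregated "results" configuration, or with a timestamped split name to pin a specific run instead of "latest". A minimal sketch, using only config and split names taken from this card's YAML header:
```python
from datasets import load_dataset

# Aggregated metrics for the whole run (the "results" configuration).
results = load_dataset(
    "open-llm-leaderboard/details_TheBloke__manticore-13b-chat-pyg-GPTQ",
    "results",
    split="latest",
)

# A single task, pinned to the timestamped run rather than "latest".
world_religions = load_dataset(
    "open-llm-leaderboard/details_TheBloke__manticore-13b-chat-pyg-GPTQ",
    "harness_hendrycksTest_world_religions_5",
    split="2023_09_13T00_35_05.075823",
)
```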
## Latest results
These are the [latest results from run 2023-09-13T00:35:05.075823](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__manticore-13b-chat-pyg-GPTQ/blob/main/results_2023-09-13T00-35-05.075823.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.47910157210911797,
"acc_stderr": 0.035136312942949756,
"acc_norm": 0.4829754447260613,
"acc_norm_stderr": 0.035118284304147665,
"mc1": 0.3268053855569155,
"mc1_stderr": 0.016419874731135032,
"mc2": 0.4776861669816965,
"mc2_stderr": 0.014996477492223563
},
"harness|arc:challenge|25": {
"acc": 0.5511945392491467,
"acc_stderr": 0.014534599585097662,
"acc_norm": 0.5784982935153583,
"acc_norm_stderr": 0.014430197069326025
},
"harness|hellaswag|10": {
"acc": 0.6094403505277833,
"acc_stderr": 0.004868787333436583,
"acc_norm": 0.8106950806612229,
"acc_norm_stderr": 0.0039095001598848985
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.506578947368421,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.506578947368421,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5169811320754717,
"acc_stderr": 0.030755120364119905,
"acc_norm": 0.5169811320754717,
"acc_norm_stderr": 0.030755120364119905
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4791666666666667,
"acc_stderr": 0.041775789507399935,
"acc_norm": 0.4791666666666667,
"acc_norm_stderr": 0.041775789507399935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.42196531791907516,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.42196531791907516,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929774,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929774
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2830687830687831,
"acc_stderr": 0.023201392938194978,
"acc_norm": 0.2830687830687831,
"acc_norm_stderr": 0.023201392938194978
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.04104947269903394,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.04104947269903394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.49032258064516127,
"acc_stderr": 0.028438677998909565,
"acc_norm": 0.49032258064516127,
"acc_norm_stderr": 0.028438677998909565
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6,
"acc_stderr": 0.03825460278380026,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03825460278380026
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5404040404040404,
"acc_stderr": 0.03550702465131343,
"acc_norm": 0.5404040404040404,
"acc_norm_stderr": 0.03550702465131343
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6424870466321243,
"acc_stderr": 0.034588160421810114,
"acc_norm": 0.6424870466321243,
"acc_norm_stderr": 0.034588160421810114
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.43846153846153846,
"acc_stderr": 0.02515826601686857,
"acc_norm": 0.43846153846153846,
"acc_norm_stderr": 0.02515826601686857
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.453781512605042,
"acc_stderr": 0.032339434681820885,
"acc_norm": 0.453781512605042,
"acc_norm_stderr": 0.032339434681820885
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6018348623853211,
"acc_stderr": 0.02098798942265427,
"acc_norm": 0.6018348623853211,
"acc_norm_stderr": 0.02098798942265427
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.27314814814814814,
"acc_stderr": 0.030388051301678116,
"acc_norm": 0.27314814814814814,
"acc_norm_stderr": 0.030388051301678116
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.03364487286088299,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.03364487286088299
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7088607594936709,
"acc_stderr": 0.02957160106575337,
"acc_norm": 0.7088607594936709,
"acc_norm_stderr": 0.02957160106575337
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5426008968609866,
"acc_stderr": 0.03343577705583065,
"acc_norm": 0.5426008968609866,
"acc_norm_stderr": 0.03343577705583065
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5343511450381679,
"acc_stderr": 0.043749285605997376,
"acc_norm": 0.5343511450381679,
"acc_norm_stderr": 0.043749285605997376
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.04369236326573981,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.04369236326573981
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760627,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.50920245398773,
"acc_stderr": 0.03927705600787443,
"acc_norm": 0.50920245398773,
"acc_norm_stderr": 0.03927705600787443
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.6116504854368932,
"acc_stderr": 0.0482572933735639,
"acc_norm": 0.6116504854368932,
"acc_norm_stderr": 0.0482572933735639
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.717948717948718,
"acc_stderr": 0.029480360549541187,
"acc_norm": 0.717948717948718,
"acc_norm_stderr": 0.029480360549541187
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6590038314176245,
"acc_stderr": 0.016951781383223313,
"acc_norm": 0.6590038314176245,
"acc_norm_stderr": 0.016951781383223313
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4797687861271676,
"acc_stderr": 0.026897049996382868,
"acc_norm": 0.4797687861271676,
"acc_norm_stderr": 0.026897049996382868
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23575418994413408,
"acc_stderr": 0.014196375686290804,
"acc_norm": 0.23575418994413408,
"acc_norm_stderr": 0.014196375686290804
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5032679738562091,
"acc_stderr": 0.028629305194003543,
"acc_norm": 0.5032679738562091,
"acc_norm_stderr": 0.028629305194003543
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5434083601286174,
"acc_stderr": 0.028290869054197604,
"acc_norm": 0.5434083601286174,
"acc_norm_stderr": 0.028290869054197604
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.027777777777777804,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.027777777777777804
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4078014184397163,
"acc_stderr": 0.029316011776343555,
"acc_norm": 0.4078014184397163,
"acc_norm_stderr": 0.029316011776343555
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3970013037809648,
"acc_stderr": 0.012496346982909556,
"acc_norm": 0.3970013037809648,
"acc_norm_stderr": 0.012496346982909556
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.030320243265004137,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.030320243265004137
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.48856209150326796,
"acc_stderr": 0.020222541515610863,
"acc_norm": 0.48856209150326796,
"acc_norm_stderr": 0.020222541515610863
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.509090909090909,
"acc_stderr": 0.04788339768702861,
"acc_norm": 0.509090909090909,
"acc_norm_stderr": 0.04788339768702861
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5183673469387755,
"acc_stderr": 0.03198761546763127,
"acc_norm": 0.5183673469387755,
"acc_norm_stderr": 0.03198761546763127
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6467661691542289,
"acc_stderr": 0.03379790611796777,
"acc_norm": 0.6467661691542289,
"acc_norm_stderr": 0.03379790611796777
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6900584795321637,
"acc_stderr": 0.035469769593931624,
"acc_norm": 0.6900584795321637,
"acc_norm_stderr": 0.035469769593931624
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3268053855569155,
"mc1_stderr": 0.016419874731135032,
"mc2": 0.4776861669816965,
"mc2_stderr": 0.014996477492223563
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/mai_pokemon | 2023-09-17T17:35:01.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mai (Pokémon)
This is the dataset of mai (Pokémon), containing 69 images and their tags.
Images are crawled from many sites (e.g., danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 69 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 179 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 69 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 69 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 69 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 69 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 69 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 179 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 179 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 179 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
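To fetch one of the archives listed above programmatically, here is a minimal sketch using `huggingface_hub`; the filename comes from the Download column, and the flat repo layout (zips at the repo root) is an assumption based on the relative links:
```python
from huggingface_hub import hf_hub_download
import zipfile

# Download one archive from this dataset repo into the local HF cache.
zip_path = hf_hub_download(
    repo_id="CyberHarem/mai_pokemon",
    filename="dataset-raw.zip",  # any name from the Download column
    repo_type="dataset",
)

# Unpack the images and their meta information.
# "mai_pokemon_raw" is a hypothetical output directory name.
with zipfile.ZipFile(zip_path) as zf:
    zf.extractall("mai_pokemon_raw")
```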
|
CyberHarem/ruri_pokemon | 2023-09-17T17:35:03.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ruri (Pokémon)
This is the dataset of ruri (Pokémon), containing 26 images and their tags.
Images are crawled from many sites (e.g., danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 26 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 69 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 26 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 26 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 26 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 26 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 26 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 69 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 69 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 69 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
sid-li/test | 2023-09-13T01:16:49.000Z | [
"region:us"
] | sid-li | null | null | null | 0 | 0 | Entry not found |
Hachiki/asmr-mami | 2023-09-13T01:22:56.000Z | [
"region:us"
] | Hachiki | null | null | null | 0 | 0 | Entry not found |
CyberHarem/melissa_pokemon | 2023-09-17T17:35:05.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of melissa (Pokémon)
This is the dataset of melissa (Pokémon), containing 86 images and their tags.
Images are crawled from many sites (e.g., danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 86 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 219 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 86 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 86 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 86 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 86 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 86 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 219 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 219 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 219 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
approach0/MATH_and_PRM | 2023-09-13T01:47:13.000Z | [
"region:us"
] | approach0 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: src_path
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 15325348.0
num_examples: 13665
- name: test
num_bytes: 8685910.0
num_examples: 8076
download_size: 9782004
dataset_size: 24011258.0
---
# Dataset Card for "MATH_and_PRM"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
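Given the schema declared in `dataset_info` above, a minimal loading sketch (field and split names are taken from that block, assuming the repo resolves directly with `load_dataset`):
```python
from datasets import load_dataset

# Both splits declared in dataset_info: train (13665) and test (8076).
ds = load_dataset("approach0/MATH_and_PRM")

example = ds["train"][0]
# Each record carries the four string fields from the card's schema.
print(example["src_path"])
print(example["instruction"])
print(example["input"])
print(example["output"])
``` |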
CyberHarem/ran_pokemon | 2023-09-17T17:35:07.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ran (Pokémon)
This is the dataset of ran (Pokémon), containing 11 images and their tags.
Images are crawled from many sites (e.g., danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 11 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 29 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 11 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 11 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 11 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 11 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 11 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 29 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 29 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 29 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_teknium__OpenHermes-13B | 2023-09-13T02:07:29.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of teknium/OpenHermes-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [teknium/OpenHermes-13B](https://huggingface.co/teknium/OpenHermes-13B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 runs. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_teknium__OpenHermes-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T02:06:09.559271](https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__OpenHermes-13B/blob/main/results_2023-09-13T02-06-09.559271.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5630411968474885,\n\
\ \"acc_stderr\": 0.034493899434396784,\n \"acc_norm\": 0.567014345914575,\n\
\ \"acc_norm_stderr\": 0.03447360969744153,\n \"mc1\": 0.31946144430844553,\n\
\ \"mc1_stderr\": 0.016322644182960498,\n \"mc2\": 0.459815379294228,\n\
\ \"mc2_stderr\": 0.015281682974346678\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5639931740614335,\n \"acc_stderr\": 0.014491225699230916,\n\
\ \"acc_norm\": 0.6015358361774744,\n \"acc_norm_stderr\": 0.014306946052735567\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6249751045608445,\n\
\ \"acc_stderr\": 0.0048313992185002345,\n \"acc_norm\": 0.8218482374029078,\n\
\ \"acc_norm_stderr\": 0.003818584384635532\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411022,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411022\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.04033565667848319,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.04033565667848319\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5660377358490566,\n \"acc_stderr\": 0.030503292013342596,\n\
\ \"acc_norm\": 0.5660377358490566,\n \"acc_norm_stderr\": 0.030503292013342596\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.03265019475033583,\n\
\ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.03265019475033583\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36243386243386244,\n \"acc_stderr\": 0.02475747390275206,\n \"\
acc_norm\": 0.36243386243386244,\n \"acc_norm_stderr\": 0.02475747390275206\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6419354838709678,\n\
\ \"acc_stderr\": 0.027273890594300642,\n \"acc_norm\": 0.6419354838709678,\n\
\ \"acc_norm_stderr\": 0.027273890594300642\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n\
\ \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6767676767676768,\n \"acc_stderr\": 0.033322999210706444,\n \"\
acc_norm\": 0.6767676767676768,\n \"acc_norm_stderr\": 0.033322999210706444\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7823834196891192,\n \"acc_stderr\": 0.029778663037752954,\n\
\ \"acc_norm\": 0.7823834196891192,\n \"acc_norm_stderr\": 0.029778663037752954\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5256410256410257,\n \"acc_stderr\": 0.02531764972644866,\n \
\ \"acc_norm\": 0.5256410256410257,\n \"acc_norm_stderr\": 0.02531764972644866\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683522,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683522\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.032104790510157764,\n\
\ \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.032104790510157764\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257374,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257374\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7302752293577982,\n \"acc_stderr\": 0.01902848671111544,\n \"\
acc_norm\": 0.7302752293577982,\n \"acc_norm_stderr\": 0.01902848671111544\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854053,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854053\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460295,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460295\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572924,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572924\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.024662496845209807,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.024662496845209807\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7509578544061303,\n\
\ \"acc_stderr\": 0.015464676163395951,\n \"acc_norm\": 0.7509578544061303,\n\
\ \"acc_norm_stderr\": 0.015464676163395951\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.02572280220089581,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.02572280220089581\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42793296089385474,\n\
\ \"acc_stderr\": 0.016547887997416105,\n \"acc_norm\": 0.42793296089385474,\n\
\ \"acc_norm_stderr\": 0.016547887997416105\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.02773283435336394,\n\
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.02773283435336394\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n\
\ \"acc_stderr\": 0.027648149599751464,\n \"acc_norm\": 0.6141479099678456,\n\
\ \"acc_norm_stderr\": 0.027648149599751464\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.026822801759507898,\n\
\ \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.026822801759507898\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4432624113475177,\n \"acc_stderr\": 0.029634838473766006,\n \
\ \"acc_norm\": 0.4432624113475177,\n \"acc_norm_stderr\": 0.029634838473766006\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4315514993481095,\n\
\ \"acc_stderr\": 0.01265000799946388,\n \"acc_norm\": 0.4315514993481095,\n\
\ \"acc_norm_stderr\": 0.01265000799946388\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5477941176470589,\n \"acc_stderr\": 0.03023375855159645,\n\
\ \"acc_norm\": 0.5477941176470589,\n \"acc_norm_stderr\": 0.03023375855159645\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.545751633986928,\n \"acc_stderr\": 0.020142974553795198,\n \
\ \"acc_norm\": 0.545751633986928,\n \"acc_norm_stderr\": 0.020142974553795198\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726492,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726492\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\
\ \"acc_stderr\": 0.03134328358208955,\n \"acc_norm\": 0.7313432835820896,\n\
\ \"acc_norm_stderr\": 0.03134328358208955\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036847,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036847\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.031885780176863984,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.031885780176863984\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31946144430844553,\n\
\ \"mc1_stderr\": 0.016322644182960498,\n \"mc2\": 0.459815379294228,\n\
\ \"mc2_stderr\": 0.015281682974346678\n }\n}\n```"
repo_url: https://huggingface.co/teknium/OpenHermes-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|arc:challenge|25_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|arc:challenge|25_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hellaswag|10_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hellaswag|10_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T02-06-09.559271.parquet'
- config_name: results
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- results_2023-09-13T01-56-57.835904.parquet
- split: 2023_09_13T02_06_09.559271
path:
- results_2023-09-13T02-06-09.559271.parquet
- split: latest
path:
- results_2023-09-13T02-06-09.559271.parquet
---
# Dataset Card for Evaluation run of teknium/OpenHermes-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/teknium/OpenHermes-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [teknium/OpenHermes-13B](https://huggingface.co/teknium/OpenHermes-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, one for each evaluated task.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_teknium__OpenHermes-13B",
"harness_truthfulqa_mc_0",
split="train")
```
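Individual runs can also be loaded directly. The config and split names below are taken verbatim from the `configs` section of this card; a minimal sketch:
```python
from datasets import load_dataset

# "latest" always resolves to the most recent run's parquet files.
arc_latest = load_dataset(
    "open-llm-leaderboard/details_teknium__OpenHermes-13B",
    "harness_arc_challenge_25",
    split="latest",
)

# A specific run can be selected by its timestamped split name.
arc_first_run = load_dataset(
    "open-llm-leaderboard/details_teknium__OpenHermes-13B",
    "harness_arc_challenge_25",
    split="2023_09_13T01_56_57.835904",
)

print(len(arc_latest), len(arc_first_run))
```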
## Latest results
These are the [latest results from run 2023-09-13T02:06:09.559271](https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__OpenHermes-13B/blob/main/results_2023-09-13T02-06-09.559271.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5630411968474885,
"acc_stderr": 0.034493899434396784,
"acc_norm": 0.567014345914575,
"acc_norm_stderr": 0.03447360969744153,
"mc1": 0.31946144430844553,
"mc1_stderr": 0.016322644182960498,
"mc2": 0.459815379294228,
"mc2_stderr": 0.015281682974346678
},
"harness|arc:challenge|25": {
"acc": 0.5639931740614335,
"acc_stderr": 0.014491225699230916,
"acc_norm": 0.6015358361774744,
"acc_norm_stderr": 0.014306946052735567
},
"harness|hellaswag|10": {
"acc": 0.6249751045608445,
"acc_stderr": 0.0048313992185002345,
"acc_norm": 0.8218482374029078,
"acc_norm_stderr": 0.003818584384635532
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.04033565667848319,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.04033565667848319
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5660377358490566,
"acc_stderr": 0.030503292013342596,
"acc_norm": 0.5660377358490566,
"acc_norm_stderr": 0.030503292013342596
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.03265019475033583,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.03265019475033583
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.04164188720169377,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.04164188720169377
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36243386243386244,
"acc_stderr": 0.02475747390275206,
"acc_norm": 0.36243386243386244,
"acc_norm_stderr": 0.02475747390275206
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6419354838709678,
"acc_stderr": 0.027273890594300642,
"acc_norm": 0.6419354838709678,
"acc_norm_stderr": 0.027273890594300642
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6767676767676768,
"acc_stderr": 0.033322999210706444,
"acc_norm": 0.6767676767676768,
"acc_norm_stderr": 0.033322999210706444
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7823834196891192,
"acc_stderr": 0.029778663037752954,
"acc_norm": 0.7823834196891192,
"acc_norm_stderr": 0.029778663037752954
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5256410256410257,
"acc_stderr": 0.02531764972644866,
"acc_norm": 0.5256410256410257,
"acc_norm_stderr": 0.02531764972644866
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683522,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683522
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5756302521008403,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.5756302521008403,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257374,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257374
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7302752293577982,
"acc_stderr": 0.01902848671111544,
"acc_norm": 0.7302752293577982,
"acc_norm_stderr": 0.01902848671111544
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854053,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854053
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251735,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251735
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460295,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460295
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.04498676320572924,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.04498676320572924
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.024662496845209807,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.024662496845209807
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7509578544061303,
"acc_stderr": 0.015464676163395951,
"acc_norm": 0.7509578544061303,
"acc_norm_stderr": 0.015464676163395951
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.02572280220089581,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.02572280220089581
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42793296089385474,
"acc_stderr": 0.016547887997416105,
"acc_norm": 0.42793296089385474,
"acc_norm_stderr": 0.016547887997416105
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.02773283435336394,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.02773283435336394
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6141479099678456,
"acc_stderr": 0.027648149599751464,
"acc_norm": 0.6141479099678456,
"acc_norm_stderr": 0.027648149599751464
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6327160493827161,
"acc_stderr": 0.026822801759507898,
"acc_norm": 0.6327160493827161,
"acc_norm_stderr": 0.026822801759507898
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4432624113475177,
"acc_stderr": 0.029634838473766006,
"acc_norm": 0.4432624113475177,
"acc_norm_stderr": 0.029634838473766006
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4315514993481095,
"acc_stderr": 0.01265000799946388,
"acc_norm": 0.4315514993481095,
"acc_norm_stderr": 0.01265000799946388
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5477941176470589,
"acc_stderr": 0.03023375855159645,
"acc_norm": 0.5477941176470589,
"acc_norm_stderr": 0.03023375855159645
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.545751633986928,
"acc_stderr": 0.020142974553795198,
"acc_norm": 0.545751633986928,
"acc_norm_stderr": 0.020142974553795198
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726492,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726492
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.03134328358208955,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.03134328358208955
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036847,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036847
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.031885780176863984,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.031885780176863984
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31946144430844553,
"mc1_stderr": 0.016322644182960498,
"mc2": 0.459815379294228,
"mc2_stderr": 0.015281682974346678
}
}
```
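The same aggregated numbers can be read programmatically from the `results` configuration declared in the YAML header above. A minimal sketch; the exact column layout of the results parquet may differ from the nested JSON shown, so inspect the features before relying on a particular metric name:
```python
from datasets import load_dataset

# The "results" config aggregates all task metrics for this model;
# its "latest" split points at the most recent run's results parquet.
results = load_dataset(
    "open-llm-leaderboard/details_teknium__OpenHermes-13B",
    "results",
    split="latest",
)

# Check the schema first: the parquet flattens the JSON results,
# so metric names may be prefixed by their task.
print(results.column_names)
print(results[0])
```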
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]