| id (string) | lastModified (string) | tags (list) | author (string) | description (string) | citation (string) | cardData | likes (int64) | downloads (int64) | card (string) |
|---|---|---|---|---|---|---|---|---|---|
| dashondash/chinlora | 2023-09-11T17:16:44.000Z | ["region:us"] | dashondash | null | null | null | 0 | 0 | Entry not found |
| open-llm-leaderboard/details_Writer__palmyra-med-20b | 2023-09-12T21:54:38.000Z | ["region:us"] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Writer/palmyra-med-20b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Writer/palmyra-med-20b](https://huggingface.co/Writer/palmyra-med-20b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Writer__palmyra-med-20b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T21:53:25.718910](https://huggingface.co/datasets/open-llm-leaderboard/details_Writer__palmyra-med-20b/blob/main/results_2023-09-12T21-53-25.718910.json) (note
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.445324610748968,\n\
\ \"acc_stderr\": 0.03532955676849744,\n \"acc_norm\": 0.44895877457621725,\n\
\ \"acc_norm_stderr\": 0.03532172217737332,\n \"mc1\": 0.22031823745410037,\n\
\ \"mc1_stderr\": 0.014509045171487288,\n \"mc2\": 0.3553221305957241,\n\
\ \"mc2_stderr\": 0.014174982761442424\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.43430034129692835,\n \"acc_stderr\": 0.014484703048857364,\n\
\ \"acc_norm\": 0.46757679180887374,\n \"acc_norm_stderr\": 0.01458063756999542\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5542720573590918,\n\
\ \"acc_stderr\": 0.004960299952519407,\n \"acc_norm\": 0.7354112726548496,\n\
\ \"acc_norm_stderr\": 0.004402124555058386\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5320754716981132,\n \"acc_stderr\": 0.030709486992556552,\n\
\ \"acc_norm\": 0.5320754716981132,\n \"acc_norm_stderr\": 0.030709486992556552\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4236111111111111,\n\
\ \"acc_stderr\": 0.04132125019723369,\n \"acc_norm\": 0.4236111111111111,\n\
\ \"acc_norm_stderr\": 0.04132125019723369\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4046242774566474,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.4046242774566474,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179327,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179327\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4085106382978723,\n \"acc_stderr\": 0.03213418026701576,\n\
\ \"acc_norm\": 0.4085106382978723,\n \"acc_norm_stderr\": 0.03213418026701576\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2804232804232804,\n \"acc_stderr\": 0.023135287974325635,\n \"\
acc_norm\": 0.2804232804232804,\n \"acc_norm_stderr\": 0.023135287974325635\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5258064516129032,\n\
\ \"acc_stderr\": 0.02840609505765332,\n \"acc_norm\": 0.5258064516129032,\n\
\ \"acc_norm_stderr\": 0.02840609505765332\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.03210494433751458,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.03210494433751458\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.48484848484848486,\n \"acc_stderr\": 0.03902551007374448,\n\
\ \"acc_norm\": 0.48484848484848486,\n \"acc_norm_stderr\": 0.03902551007374448\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5959595959595959,\n \"acc_stderr\": 0.03496130972056127,\n \"\
acc_norm\": 0.5959595959595959,\n \"acc_norm_stderr\": 0.03496130972056127\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5699481865284974,\n \"acc_stderr\": 0.035729543331448094,\n\
\ \"acc_norm\": 0.5699481865284974,\n \"acc_norm_stderr\": 0.035729543331448094\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4512820512820513,\n \"acc_stderr\": 0.025230381238934833,\n\
\ \"acc_norm\": 0.4512820512820513,\n \"acc_norm_stderr\": 0.025230381238934833\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02730914058823018,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02730914058823018\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3907563025210084,\n \"acc_stderr\": 0.031693802357129965,\n\
\ \"acc_norm\": 0.3907563025210084,\n \"acc_norm_stderr\": 0.031693802357129965\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.24503311258278146,\n \"acc_stderr\": 0.03511807571804725,\n \"\
acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.03511807571804725\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6091743119266055,\n \"acc_stderr\": 0.020920058346111055,\n \"\
acc_norm\": 0.6091743119266055,\n \"acc_norm_stderr\": 0.020920058346111055\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.35185185185185186,\n \"acc_stderr\": 0.032568505702936484,\n \"\
acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.032568505702936484\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4803921568627451,\n \"acc_stderr\": 0.03506612560524866,\n \"\
acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.03506612560524866\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.569620253164557,\n \"acc_stderr\": 0.032230171959375976,\n \
\ \"acc_norm\": 0.569620253164557,\n \"acc_norm_stderr\": 0.032230171959375976\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5201793721973094,\n\
\ \"acc_stderr\": 0.033530461674123005,\n \"acc_norm\": 0.5201793721973094,\n\
\ \"acc_norm_stderr\": 0.033530461674123005\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5419847328244275,\n \"acc_stderr\": 0.04369802690578756,\n\
\ \"acc_norm\": 0.5419847328244275,\n \"acc_norm_stderr\": 0.04369802690578756\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.49586776859504134,\n \"acc_stderr\": 0.045641987674327526,\n \"\
acc_norm\": 0.49586776859504134,\n \"acc_norm_stderr\": 0.045641987674327526\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.49074074074074076,\n\
\ \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.49074074074074076,\n\
\ \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3987730061349693,\n \"acc_stderr\": 0.03847021420456026,\n\
\ \"acc_norm\": 0.3987730061349693,\n \"acc_norm_stderr\": 0.03847021420456026\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5825242718446602,\n \"acc_stderr\": 0.048828405482122375,\n\
\ \"acc_norm\": 0.5825242718446602,\n \"acc_norm_stderr\": 0.048828405482122375\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5854700854700855,\n\
\ \"acc_stderr\": 0.03227396567623779,\n \"acc_norm\": 0.5854700854700855,\n\
\ \"acc_norm_stderr\": 0.03227396567623779\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5900383141762452,\n\
\ \"acc_stderr\": 0.017587672312336048,\n \"acc_norm\": 0.5900383141762452,\n\
\ \"acc_norm_stderr\": 0.017587672312336048\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5144508670520231,\n \"acc_stderr\": 0.02690784985628254,\n\
\ \"acc_norm\": 0.5144508670520231,\n \"acc_norm_stderr\": 0.02690784985628254\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24804469273743016,\n\
\ \"acc_stderr\": 0.014444157808261427,\n \"acc_norm\": 0.24804469273743016,\n\
\ \"acc_norm_stderr\": 0.014444157808261427\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.028541722692618874,\n\
\ \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.028541722692618874\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4533762057877814,\n\
\ \"acc_stderr\": 0.02827435985489424,\n \"acc_norm\": 0.4533762057877814,\n\
\ \"acc_norm_stderr\": 0.02827435985489424\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4876543209876543,\n \"acc_stderr\": 0.027812262269327242,\n\
\ \"acc_norm\": 0.4876543209876543,\n \"acc_norm_stderr\": 0.027812262269327242\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.32978723404255317,\n \"acc_stderr\": 0.02804594694204239,\n \
\ \"acc_norm\": 0.32978723404255317,\n \"acc_norm_stderr\": 0.02804594694204239\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35723598435462844,\n\
\ \"acc_stderr\": 0.012238615750316505,\n \"acc_norm\": 0.35723598435462844,\n\
\ \"acc_norm_stderr\": 0.012238615750316505\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.030332578094555033,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.030332578094555033\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4395424836601307,\n \"acc_stderr\": 0.02007942040808792,\n \
\ \"acc_norm\": 0.4395424836601307,\n \"acc_norm_stderr\": 0.02007942040808792\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4897959183673469,\n \"acc_stderr\": 0.03200255347893782,\n\
\ \"acc_norm\": 0.4897959183673469,\n \"acc_norm_stderr\": 0.03200255347893782\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5970149253731343,\n\
\ \"acc_stderr\": 0.034683432951111266,\n \"acc_norm\": 0.5970149253731343,\n\
\ \"acc_norm_stderr\": 0.034683432951111266\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5730994152046783,\n \"acc_stderr\": 0.03793620616529917,\n\
\ \"acc_norm\": 0.5730994152046783,\n \"acc_norm_stderr\": 0.03793620616529917\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22031823745410037,\n\
\ \"mc1_stderr\": 0.014509045171487288,\n \"mc2\": 0.3553221305957241,\n\
\ \"mc2_stderr\": 0.014174982761442424\n }\n}\n```"
repo_url: https://huggingface.co/Writer/palmyra-med-20b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|arc:challenge|25_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hellaswag|10_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-21-21.677448.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T21-53-25.718910.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T21-53-25.718910.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-21-21.677448.parquet'
- split: 2023_09_12T21_53_25.718910
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T21-53-25.718910.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T21-53-25.718910.parquet'
- config_name: results
data_files:
- split: 2023_09_11T17_21_21.677448
path:
- results_2023-09-11T17-21-21.677448.parquet
- split: 2023_09_12T21_53_25.718910
path:
- results_2023-09-12T21-53-25.718910.parquet
- split: latest
path:
- results_2023-09-12T21-53-25.718910.parquet
---
# Dataset Card for Evaluation run of Writer/palmyra-med-20b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Writer/palmyra-med-20b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Writer/palmyra-med-20b](https://huggingface.co/Writer/palmyra-med-20b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the results of the most recent run.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Writer__palmyra-med-20b",
"harness_truthfulqa_mc_0",
split="train")
```
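Since each run is stored as its own timestamped split, with `latest` aliasing the most recent one, a specific run or task can be loaded the same way. A minimal sketch, assuming only the configuration and split names listed in this card's YAML:
```python
from datasets import load_dataset

# Details for one MMLU subject from the most recent run ("latest"
# aliases the 2023-09-12 parquet file declared in the config above).
latest = load_dataset(
    "open-llm-leaderboard/details_Writer__palmyra-med-20b",
    "harness_hendrycksTest_professional_medicine_5",
    split="latest",
)

# The same subject from the earlier run, addressed by its timestamp.
first_run = load_dataset(
    "open-llm-leaderboard/details_Writer__palmyra-med-20b",
    "harness_hendrycksTest_professional_medicine_5",
    split="2023_09_11T17_21_21.677448",
)
```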
## Latest results
These are the [latest results from run 2023-09-12T21:53:25.718910](https://huggingface.co/datasets/open-llm-leaderboard/details_Writer__palmyra-med-20b/blob/main/results_2023-09-12T21-53-25.718910.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.445324610748968,
"acc_stderr": 0.03532955676849744,
"acc_norm": 0.44895877457621725,
"acc_norm_stderr": 0.03532172217737332,
"mc1": 0.22031823745410037,
"mc1_stderr": 0.014509045171487288,
"mc2": 0.3553221305957241,
"mc2_stderr": 0.014174982761442424
},
"harness|arc:challenge|25": {
"acc": 0.43430034129692835,
"acc_stderr": 0.014484703048857364,
"acc_norm": 0.46757679180887374,
"acc_norm_stderr": 0.01458063756999542
},
"harness|hellaswag|10": {
"acc": 0.5542720573590918,
"acc_stderr": 0.004960299952519407,
"acc_norm": 0.7354112726548496,
"acc_norm_stderr": 0.004402124555058386
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5320754716981132,
"acc_stderr": 0.030709486992556552,
"acc_norm": 0.5320754716981132,
"acc_norm_stderr": 0.030709486992556552
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4236111111111111,
"acc_stderr": 0.04132125019723369,
"acc_norm": 0.4236111111111111,
"acc_norm_stderr": 0.04132125019723369
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4046242774566474,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.4046242774566474,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179327,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179327
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4085106382978723,
"acc_stderr": 0.03213418026701576,
"acc_norm": 0.4085106382978723,
"acc_norm_stderr": 0.03213418026701576
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2804232804232804,
"acc_stderr": 0.023135287974325635,
"acc_norm": 0.2804232804232804,
"acc_norm_stderr": 0.023135287974325635
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.037184890068181146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.037184890068181146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5258064516129032,
"acc_stderr": 0.02840609505765332,
"acc_norm": 0.5258064516129032,
"acc_norm_stderr": 0.02840609505765332
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.03210494433751458,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.03210494433751458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.48484848484848486,
"acc_stderr": 0.03902551007374448,
"acc_norm": 0.48484848484848486,
"acc_norm_stderr": 0.03902551007374448
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5959595959595959,
"acc_stderr": 0.03496130972056127,
"acc_norm": 0.5959595959595959,
"acc_norm_stderr": 0.03496130972056127
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5699481865284974,
"acc_stderr": 0.035729543331448094,
"acc_norm": 0.5699481865284974,
"acc_norm_stderr": 0.035729543331448094
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4512820512820513,
"acc_stderr": 0.025230381238934833,
"acc_norm": 0.4512820512820513,
"acc_norm_stderr": 0.025230381238934833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02730914058823018,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02730914058823018
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3907563025210084,
"acc_stderr": 0.031693802357129965,
"acc_norm": 0.3907563025210084,
"acc_norm_stderr": 0.031693802357129965
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.03511807571804725,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.03511807571804725
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6091743119266055,
"acc_stderr": 0.020920058346111055,
"acc_norm": 0.6091743119266055,
"acc_norm_stderr": 0.020920058346111055
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.032568505702936484,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.032568505702936484
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4803921568627451,
"acc_stderr": 0.03506612560524866,
"acc_norm": 0.4803921568627451,
"acc_norm_stderr": 0.03506612560524866
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.569620253164557,
"acc_stderr": 0.032230171959375976,
"acc_norm": 0.569620253164557,
"acc_norm_stderr": 0.032230171959375976
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5201793721973094,
"acc_stderr": 0.033530461674123005,
"acc_norm": 0.5201793721973094,
"acc_norm_stderr": 0.033530461674123005
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5419847328244275,
"acc_stderr": 0.04369802690578756,
"acc_norm": 0.5419847328244275,
"acc_norm_stderr": 0.04369802690578756
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.49586776859504134,
"acc_stderr": 0.045641987674327526,
"acc_norm": 0.49586776859504134,
"acc_norm_stderr": 0.045641987674327526
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3987730061349693,
"acc_stderr": 0.03847021420456026,
"acc_norm": 0.3987730061349693,
"acc_norm_stderr": 0.03847021420456026
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.046840993210771065,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.046840993210771065
},
"harness|hendrycksTest-management|5": {
"acc": 0.5825242718446602,
"acc_stderr": 0.048828405482122375,
"acc_norm": 0.5825242718446602,
"acc_norm_stderr": 0.048828405482122375
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5854700854700855,
"acc_stderr": 0.03227396567623779,
"acc_norm": 0.5854700854700855,
"acc_norm_stderr": 0.03227396567623779
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5900383141762452,
"acc_stderr": 0.017587672312336048,
"acc_norm": 0.5900383141762452,
"acc_norm_stderr": 0.017587672312336048
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.02690784985628254,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.02690784985628254
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24804469273743016,
"acc_stderr": 0.014444157808261427,
"acc_norm": 0.24804469273743016,
"acc_norm_stderr": 0.014444157808261427
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.028541722692618874,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.028541722692618874
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4533762057877814,
"acc_stderr": 0.02827435985489424,
"acc_norm": 0.4533762057877814,
"acc_norm_stderr": 0.02827435985489424
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4876543209876543,
"acc_stderr": 0.027812262269327242,
"acc_norm": 0.4876543209876543,
"acc_norm_stderr": 0.027812262269327242
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.32978723404255317,
"acc_stderr": 0.02804594694204239,
"acc_norm": 0.32978723404255317,
"acc_norm_stderr": 0.02804594694204239
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35723598435462844,
"acc_stderr": 0.012238615750316505,
"acc_norm": 0.35723598435462844,
"acc_norm_stderr": 0.012238615750316505
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.030332578094555033,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.030332578094555033
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4395424836601307,
"acc_stderr": 0.02007942040808792,
"acc_norm": 0.4395424836601307,
"acc_norm_stderr": 0.02007942040808792
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4897959183673469,
"acc_stderr": 0.03200255347893782,
"acc_norm": 0.4897959183673469,
"acc_norm_stderr": 0.03200255347893782
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5970149253731343,
"acc_stderr": 0.034683432951111266,
"acc_norm": 0.5970149253731343,
"acc_norm_stderr": 0.034683432951111266
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5730994152046783,
"acc_stderr": 0.03793620616529917,
"acc_norm": 0.5730994152046783,
"acc_norm_stderr": 0.03793620616529917
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22031823745410037,
"mc1_stderr": 0.014509045171487288,
"mc2": 0.3553221305957241,
"mc2_stderr": 0.014174982761442424
}
}
```
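The same aggregates can be pulled programmatically from the `results` configuration instead of being read off the JSON above. A minimal sketch, assuming the `results` config and `latest` split defined in this card's YAML:
```python
from datasets import load_dataset

# The "results" config stores one aggregated-metrics table per run;
# "latest" points at results_2023-09-12T21-53-25.718910.parquet.
results = load_dataset(
    "open-llm-leaderboard/details_Writer__palmyra-med-20b",
    "results",
    split="latest",
)

# The exact schema is not reproduced in this card, so inspect the
# columns before indexing into them.
print(results.column_names)
```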
### Supported Tasks and Leaderboards
The dataset tracks the Open LLM Leaderboard evaluation suite: ARC-Challenge (25-shot), HellaSwag (10-shot), the 57 MMLU/hendrycksTest subjects (5-shot), and TruthfulQA-MC (0-shot). Aggregated scores are displayed on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
Each configuration contains one split per evaluation run, named after the run's timestamp (here `2023_09_11T17_21_21.677448` and `2023_09_12T21_53_25.718910`), plus a `latest` split that always points to the files of the most recent run.
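A short sketch for enumerating the configurations and splits without downloading any data, assuming a `datasets` version that ships `get_dataset_config_names` and `get_dataset_split_names`:
```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_Writer__palmyra-med-20b"

# One configuration per evaluated task, plus the aggregated "results".
configs = get_dataset_config_names(repo)
print(len(configs))

# Each configuration exposes one timestamped split per run plus "latest".
print(get_dataset_split_names(repo, "harness_truthfulqa_mc_0"))
```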
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Undi95__CreativityEngine | 2023-09-11T17:23:56.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Undi95/CreativityEngine
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/CreativityEngine](https://huggingface.co/Undi95/CreativityEngine) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the results of the most recent run.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__CreativityEngine\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-11T17:22:32.752077](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__CreativityEngine/blob/main/results_2023-09-11T17-22-32.752077.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5374017113191415,\n\
\ \"acc_stderr\": 0.03464431782025622,\n \"acc_norm\": 0.5413941551796453,\n\
\ \"acc_norm_stderr\": 0.03462408055465485,\n \"mc1\": 0.3769889840881273,\n\
\ \"mc1_stderr\": 0.016965517578930354,\n \"mc2\": 0.5246393270856821,\n\
\ \"mc2_stderr\": 0.015798380259118193\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5563139931740614,\n \"acc_stderr\": 0.014518421825670444,\n\
\ \"acc_norm\": 0.5930034129692833,\n \"acc_norm_stderr\": 0.014356399418009121\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6253734315873332,\n\
\ \"acc_stderr\": 0.004830371317841056,\n \"acc_norm\": 0.8242381995618403,\n\
\ \"acc_norm_stderr\": 0.0037983950550215346\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04068942293855797,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04068942293855797\n },\n\
\ \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5547169811320755,\n \"acc_stderr\": 0.030588052974270655,\n\
\ \"acc_norm\": 0.5547169811320755,\n \"acc_norm_stderr\": 0.030588052974270655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5416666666666666,\n\
\ \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.5416666666666666,\n\
\ \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4508670520231214,\n\
\ \"acc_stderr\": 0.037940126746970296,\n \"acc_norm\": 0.4508670520231214,\n\
\ \"acc_norm_stderr\": 0.037940126746970296\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179328,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179328\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748142,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748142\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31746031746031744,\n \"acc_stderr\": 0.02397386199899207,\n \"\
acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.02397386199899207\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5935483870967742,\n\
\ \"acc_stderr\": 0.027941727346256308,\n \"acc_norm\": 0.5935483870967742,\n\
\ \"acc_norm_stderr\": 0.027941727346256308\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4039408866995074,\n \"acc_stderr\": 0.03452453903822039,\n\
\ \"acc_norm\": 0.4039408866995074,\n \"acc_norm_stderr\": 0.03452453903822039\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6767676767676768,\n \"acc_stderr\": 0.03332299921070644,\n \"\
acc_norm\": 0.6767676767676768,\n \"acc_norm_stderr\": 0.03332299921070644\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.772020725388601,\n \"acc_stderr\": 0.030276909945178274,\n\
\ \"acc_norm\": 0.772020725388601,\n \"acc_norm_stderr\": 0.030276909945178274\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.517948717948718,\n \"acc_stderr\": 0.02533466708095492,\n \
\ \"acc_norm\": 0.517948717948718,\n \"acc_norm_stderr\": 0.02533466708095492\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4831932773109244,\n \"acc_stderr\": 0.03246013680375308,\n \
\ \"acc_norm\": 0.4831932773109244,\n \"acc_norm_stderr\": 0.03246013680375308\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7027522935779816,\n \"acc_stderr\": 0.019595707224643533,\n \"\
acc_norm\": 0.7027522935779816,\n \"acc_norm_stderr\": 0.019595707224643533\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.03293377139415191,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.03293377139415191\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7254901960784313,\n \"acc_stderr\": 0.031321798030832904,\n \"\
acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.031321798030832904\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7257383966244726,\n \"acc_stderr\": 0.029041333510598035,\n \
\ \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.029041333510598035\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009224,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009224\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912073,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912073\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n\
\ \"acc_stderr\": 0.045879047413018105,\n \"acc_norm\": 0.6574074074074074,\n\
\ \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6407766990291263,\n \"acc_stderr\": 0.047504583990416946,\n\
\ \"acc_norm\": 0.6407766990291263,\n \"acc_norm_stderr\": 0.047504583990416946\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.025819233256483717,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.025819233256483717\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7432950191570882,\n\
\ \"acc_stderr\": 0.015620480263064524,\n \"acc_norm\": 0.7432950191570882,\n\
\ \"acc_norm_stderr\": 0.015620480263064524\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.026296227915613674,\n\
\ \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.026296227915613674\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43575418994413406,\n\
\ \"acc_stderr\": 0.01658388195860239,\n \"acc_norm\": 0.43575418994413406,\n\
\ \"acc_norm_stderr\": 0.01658388195860239\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.545751633986928,\n \"acc_stderr\": 0.028509807802626592,\n\
\ \"acc_norm\": 0.545751633986928,\n \"acc_norm_stderr\": 0.028509807802626592\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.617363344051447,\n\
\ \"acc_stderr\": 0.027604689028581986,\n \"acc_norm\": 0.617363344051447,\n\
\ \"acc_norm_stderr\": 0.027604689028581986\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6018518518518519,\n \"acc_stderr\": 0.027237415094592474,\n\
\ \"acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.027237415094592474\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40070921985815605,\n \"acc_stderr\": 0.029233465745573083,\n \
\ \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.029233465745573083\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4198174706649283,\n\
\ \"acc_stderr\": 0.012604960816087375,\n \"acc_norm\": 0.4198174706649283,\n\
\ \"acc_norm_stderr\": 0.012604960816087375\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.030372836961539352,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.030372836961539352\n \
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\"\
: 0.5408496732026143,\n \"acc_stderr\": 0.020160213617222516,\n \"\
acc_norm\": 0.5408496732026143,\n \"acc_norm_stderr\": 0.020160213617222516\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5795918367346938,\n \"acc_stderr\": 0.03160106993449601,\n\
\ \"acc_norm\": 0.5795918367346938,\n \"acc_norm_stderr\": 0.03160106993449601\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6865671641791045,\n\
\ \"acc_stderr\": 0.03280188205348643,\n \"acc_norm\": 0.6865671641791045,\n\
\ \"acc_norm_stderr\": 0.03280188205348643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3769889840881273,\n\
\ \"mc1_stderr\": 0.016965517578930354,\n \"mc2\": 0.5246393270856821,\n\
\ \"mc2_stderr\": 0.015798380259118193\n }\n}\n```"
repo_url: https://huggingface.co/Undi95/CreativityEngine
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-22-32.752077.parquet'
- config_name: results
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- results_2023-09-11T17-22-32.752077.parquet
- split: latest
path:
- results_2023-09-11T17-22-32.752077.parquet
---
# Dataset Card for Evaluation run of Undi95/CreativityEngine
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/CreativityEngine
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/CreativityEngine](https://huggingface.co/Undi95/CreativityEngine) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__CreativityEngine",
"harness_truthfulqa_mc_0",
split="train")
```
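You can also pin a specific run instead of the latest results. A short sketch, using the configuration and split names listed verbatim in this card's `configs` section (only the variable names are illustrative):
```python
from datasets import load_dataset

# Load the "latest" split of a single sub-task configuration
marketing = load_dataset("open-llm-leaderboard/details_Undi95__CreativityEngine",
                         "harness_hendrycksTest_marketing_5",
                         split="latest")

# Pin the exact run instead, via its timestamped split name
marketing_run = load_dataset("open-llm-leaderboard/details_Undi95__CreativityEngine",
                             "harness_hendrycksTest_marketing_5",
                             split="2023_09_11T17_22_32.752077")
```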
## Latest results
These are the [latest results from run 2023-09-11T17:22:32.752077](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__CreativityEngine/blob/main/results_2023-09-11T17-22-32.752077.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5374017113191415,
"acc_stderr": 0.03464431782025622,
"acc_norm": 0.5413941551796453,
"acc_norm_stderr": 0.03462408055465485,
"mc1": 0.3769889840881273,
"mc1_stderr": 0.016965517578930354,
"mc2": 0.5246393270856821,
"mc2_stderr": 0.015798380259118193
},
"harness|arc:challenge|25": {
"acc": 0.5563139931740614,
"acc_stderr": 0.014518421825670444,
"acc_norm": 0.5930034129692833,
"acc_norm_stderr": 0.014356399418009121
},
"harness|hellaswag|10": {
"acc": 0.6253734315873332,
"acc_stderr": 0.004830371317841056,
"acc_norm": 0.8242381995618403,
"acc_norm_stderr": 0.0037983950550215346
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5,
"acc_stderr": 0.04068942293855797,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04068942293855797
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5547169811320755,
"acc_stderr": 0.030588052974270655,
"acc_norm": 0.5547169811320755,
"acc_norm_stderr": 0.030588052974270655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4508670520231214,
"acc_stderr": 0.037940126746970296,
"acc_norm": 0.4508670520231214,
"acc_norm_stderr": 0.037940126746970296
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179328,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179328
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748142,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748142
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.02397386199899207,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.02397386199899207
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.041905964388711366,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.041905964388711366
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5935483870967742,
"acc_stderr": 0.027941727346256308,
"acc_norm": 0.5935483870967742,
"acc_norm_stderr": 0.027941727346256308
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4039408866995074,
"acc_stderr": 0.03452453903822039,
"acc_norm": 0.4039408866995074,
"acc_norm_stderr": 0.03452453903822039
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6767676767676768,
"acc_stderr": 0.03332299921070644,
"acc_norm": 0.6767676767676768,
"acc_norm_stderr": 0.03332299921070644
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.772020725388601,
"acc_stderr": 0.030276909945178274,
"acc_norm": 0.772020725388601,
"acc_norm_stderr": 0.030276909945178274
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.517948717948718,
"acc_stderr": 0.02533466708095492,
"acc_norm": 0.517948717948718,
"acc_norm_stderr": 0.02533466708095492
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4831932773109244,
"acc_stderr": 0.03246013680375308,
"acc_norm": 0.4831932773109244,
"acc_norm_stderr": 0.03246013680375308
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7027522935779816,
"acc_stderr": 0.019595707224643533,
"acc_norm": 0.7027522935779816,
"acc_norm_stderr": 0.019595707224643533
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.03293377139415191,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.03293377139415191
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.031321798030832904,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.031321798030832904
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.029041333510598035,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.029041333510598035
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009224,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009224
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912073,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912073
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.045879047413018105,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.045879047413018105
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.037311335196738925,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.037311335196738925
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.6407766990291263,
"acc_stderr": 0.047504583990416946,
"acc_norm": 0.6407766990291263,
"acc_norm_stderr": 0.047504583990416946
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.025819233256483717,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.025819233256483717
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7432950191570882,
"acc_stderr": 0.015620480263064524,
"acc_norm": 0.7432950191570882,
"acc_norm_stderr": 0.015620480263064524
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.026296227915613674,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.026296227915613674
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43575418994413406,
"acc_stderr": 0.01658388195860239,
"acc_norm": 0.43575418994413406,
"acc_norm_stderr": 0.01658388195860239
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.545751633986928,
"acc_stderr": 0.028509807802626592,
"acc_norm": 0.545751633986928,
"acc_norm_stderr": 0.028509807802626592
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.617363344051447,
"acc_stderr": 0.027604689028581986,
"acc_norm": 0.617363344051447,
"acc_norm_stderr": 0.027604689028581986
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.027237415094592474,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.027237415094592474
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.029233465745573083,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.029233465745573083
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4198174706649283,
"acc_stderr": 0.012604960816087375,
"acc_norm": 0.4198174706649283,
"acc_norm_stderr": 0.012604960816087375
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5,
"acc_stderr": 0.030372836961539352,
"acc_norm": 0.5,
"acc_norm_stderr": 0.030372836961539352
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5408496732026143,
"acc_stderr": 0.020160213617222516,
"acc_norm": 0.5408496732026143,
"acc_norm_stderr": 0.020160213617222516
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5795918367346938,
"acc_stderr": 0.03160106993449601,
"acc_norm": 0.5795918367346938,
"acc_norm_stderr": 0.03160106993449601
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6865671641791045,
"acc_stderr": 0.03280188205348643,
"acc_norm": 0.6865671641791045,
"acc_norm_stderr": 0.03280188205348643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3769889840881273,
"mc1_stderr": 0.016965517578930354,
"mc2": 0.5246393270856821,
"mc2_stderr": 0.015798380259118193
}
}
```
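To retrieve these aggregated numbers programmatically rather than reading the JSON above, one option is to load the "results" configuration defined in this repository. A minimal sketch; the exact column layout of the results parquet isn't shown in this card, so inspect the row before relying on specific fields:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics shown above
results = load_dataset("open-llm-leaderboard/details_Undi95__CreativityEngine",
                       "results",
                       split="latest")
print(results[0])  # first row of the aggregated run results
```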
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
valeriamontero/prueba | 2023-09-11T17:32:29.000Z | [
"region:us"
] | valeriamontero | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open_Platypus_and_ccp_2.6w-3_epoch | 2023-09-11T17:29:12.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w-3_epoch
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w-3_epoch](https://huggingface.co/CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w-3_epoch)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open_Platypus_and_ccp_2.6w-3_epoch\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-11T17:27:50.905630](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open_Platypus_and_ccp_2.6w-3_epoch/blob/main/results_2023-09-11T17-27-50.905630.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5592638050943857,\n\
\ \"acc_stderr\": 0.03434942370122012,\n \"acc_norm\": 0.563408614169366,\n\
\ \"acc_norm_stderr\": 0.0343288985684629,\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.4209102222161438,\n\
\ \"mc2_stderr\": 0.014266806466458887\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5460750853242321,\n \"acc_stderr\": 0.014549221105171865,\n\
\ \"acc_norm\": 0.5861774744027304,\n \"acc_norm_stderr\": 0.014392730009221009\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6211909978092014,\n\
\ \"acc_stderr\": 0.004840990593494691,\n \"acc_norm\": 0.8256323441545509,\n\
\ \"acc_norm_stderr\": 0.003786498856769125\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.5407407407407407,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.04017901275981749,\n\
\ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.04017901275981749\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03015113445777629,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03015113445777629\n },\n\
\ \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6458333333333334,\n\
\ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.6458333333333334,\n\
\ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\
\ \"acc_stderr\": 0.03809342081273958,\n \"acc_norm\": 0.5202312138728323,\n\
\ \"acc_norm_stderr\": 0.03809342081273958\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.04655010411319616,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.04655010411319616\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46382978723404256,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.46382978723404256,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.023919984164047732,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.023919984164047732\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n\
\ \"acc_stderr\": 0.026729499068349958,\n \"acc_norm\": 0.6709677419354839,\n\
\ \"acc_norm_stderr\": 0.026729499068349958\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.03481904844438803,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03481904844438803\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.03713158067481913,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.03713158067481913\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.696969696969697,\n \"acc_stderr\": 0.032742879140268674,\n \"\
acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.032742879140268674\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.028112091210117467,\n\
\ \"acc_norm\": 0.8134715025906736,\n \"acc_norm_stderr\": 0.028112091210117467\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5461538461538461,\n \"acc_stderr\": 0.02524277098712618,\n \
\ \"acc_norm\": 0.5461538461538461,\n \"acc_norm_stderr\": 0.02524277098712618\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683526,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683526\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684594,\n \
\ \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684594\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7559633027522936,\n \"acc_stderr\": 0.01841528635141642,\n \"\
acc_norm\": 0.7559633027522936,\n \"acc_norm_stderr\": 0.01841528635141642\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591362,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7257383966244726,\n \"acc_stderr\": 0.029041333510598025,\n \
\ \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.029041333510598025\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7024793388429752,\n \"acc_stderr\": 0.041733491480835,\n \"acc_norm\"\
: 0.7024793388429752,\n \"acc_norm_stderr\": 0.041733491480835\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.043300437496507437,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.043300437496507437\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n\
\ \"acc_stderr\": 0.02685345037700917,\n \"acc_norm\": 0.7863247863247863,\n\
\ \"acc_norm_stderr\": 0.02685345037700917\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7650063856960408,\n\
\ \"acc_stderr\": 0.015162024152278446,\n \"acc_norm\": 0.7650063856960408,\n\
\ \"acc_norm_stderr\": 0.015162024152278446\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.02577029208297726,\n\
\ \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.02577029208297726\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4491620111731844,\n\
\ \"acc_stderr\": 0.016635838341631914,\n \"acc_norm\": 0.4491620111731844,\n\
\ \"acc_norm_stderr\": 0.016635838341631914\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6013071895424836,\n \"acc_stderr\": 0.028036092273891776,\n\
\ \"acc_norm\": 0.6013071895424836,\n \"acc_norm_stderr\": 0.028036092273891776\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n\
\ \"acc_stderr\": 0.02731684767419271,\n \"acc_norm\": 0.6366559485530546,\n\
\ \"acc_norm_stderr\": 0.02731684767419271\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6358024691358025,\n \"acc_stderr\": 0.026774929899722324,\n\
\ \"acc_norm\": 0.6358024691358025,\n \"acc_norm_stderr\": 0.026774929899722324\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4148936170212766,\n \"acc_stderr\": 0.0293922365846125,\n \
\ \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.0293922365846125\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4380704041720991,\n\
\ \"acc_stderr\": 0.012671902782567654,\n \"acc_norm\": 0.4380704041720991,\n\
\ \"acc_norm_stderr\": 0.012671902782567654\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5698529411764706,\n \"acc_stderr\": 0.030074971917302875,\n\
\ \"acc_norm\": 0.5698529411764706,\n \"acc_norm_stderr\": 0.030074971917302875\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5506535947712419,\n \"acc_stderr\": 0.02012376652802727,\n \
\ \"acc_norm\": 0.5506535947712419,\n \"acc_norm_stderr\": 0.02012376652802727\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6204081632653061,\n \"acc_stderr\": 0.03106721126287248,\n\
\ \"acc_norm\": 0.6204081632653061,\n \"acc_norm_stderr\": 0.03106721126287248\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.031157150869355558,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.031157150869355558\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.4209102222161438,\n\
\ \"mc2_stderr\": 0.014266806466458887\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w-3_epoch
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-27-50.905630.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-27-50.905630.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-27-50.905630.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-27-50.905630.parquet'
- config_name: results
data_files:
- split: 2023_09_11T17_27_50.905630
path:
- results_2023-09-11T17-27-50.905630.parquet
- split: latest
path:
- results_2023-09-11T17-27-50.905630.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w-3_epoch
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w-3_epoch
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w-3_epoch](https://huggingface.co/CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w-3_epoch) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open_Platypus_and_ccp_2.6w-3_epoch",
"harness_truthfulqa_mc_0",
split="train")
```
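You can also target a specific task configuration or the aggregated results directly. The following is a minimal sketch, assuming the config and split names defined in the `configs` section of this card's YAML header:
```python
from datasets import load_dataset

# Load the "latest" split of one per-task configuration; the config and
# split names come from the `configs` section of this card's YAML header.
details = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open_Platypus_and_ccp_2.6w-3_epoch",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)

# The aggregated metrics live in the "results" configuration.
results = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open_Platypus_and_ccp_2.6w-3_epoch",
    "results",
    split="latest",
)
```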
## Latest results
These are the [latest results from run 2023-09-11T17:27:50.905630](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open_Platypus_and_ccp_2.6w-3_epoch/blob/main/results_2023-09-11T17-27-50.905630.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5592638050943857,
"acc_stderr": 0.03434942370122012,
"acc_norm": 0.563408614169366,
"acc_norm_stderr": 0.0343288985684629,
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184828,
"mc2": 0.4209102222161438,
"mc2_stderr": 0.014266806466458887
},
"harness|arc:challenge|25": {
"acc": 0.5460750853242321,
"acc_stderr": 0.014549221105171865,
"acc_norm": 0.5861774744027304,
"acc_norm_stderr": 0.014392730009221009
},
"harness|hellaswag|10": {
"acc": 0.6211909978092014,
"acc_stderr": 0.004840990593494691,
"acc_norm": 0.8256323441545509,
"acc_norm_stderr": 0.003786498856769125
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5407407407407407,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.5407407407407407,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.04017901275981749,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.04017901275981749
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6,
"acc_stderr": 0.03015113445777629,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03015113445777629
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6458333333333334,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.6458333333333334,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273958,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273958
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.04655010411319616,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.04655010411319616
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46382978723404256,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.46382978723404256,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.023919984164047732,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.023919984164047732
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6709677419354839,
"acc_stderr": 0.026729499068349958,
"acc_norm": 0.6709677419354839,
"acc_norm_stderr": 0.026729499068349958
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.03481904844438803,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.03481904844438803
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.03713158067481913,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.03713158067481913
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.032742879140268674,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.032742879140268674
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8134715025906736,
"acc_stderr": 0.028112091210117467,
"acc_norm": 0.8134715025906736,
"acc_norm_stderr": 0.028112091210117467
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5461538461538461,
"acc_stderr": 0.02524277098712618,
"acc_norm": 0.5461538461538461,
"acc_norm_stderr": 0.02524277098712618
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683526,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683526
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684594,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684594
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7559633027522936,
"acc_stderr": 0.01841528635141642,
"acc_norm": 0.7559633027522936,
"acc_norm_stderr": 0.01841528635141642
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.029041333510598025,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.029041333510598025
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.041733491480835,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.041733491480835
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507437,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507437
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467762,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467762
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.02685345037700917,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.02685345037700917
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7650063856960408,
"acc_stderr": 0.015162024152278446,
"acc_norm": 0.7650063856960408,
"acc_norm_stderr": 0.015162024152278446
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6445086705202312,
"acc_stderr": 0.02577029208297726,
"acc_norm": 0.6445086705202312,
"acc_norm_stderr": 0.02577029208297726
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4491620111731844,
"acc_stderr": 0.016635838341631914,
"acc_norm": 0.4491620111731844,
"acc_norm_stderr": 0.016635838341631914
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6013071895424836,
"acc_stderr": 0.028036092273891776,
"acc_norm": 0.6013071895424836,
"acc_norm_stderr": 0.028036092273891776
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6366559485530546,
"acc_stderr": 0.02731684767419271,
"acc_norm": 0.6366559485530546,
"acc_norm_stderr": 0.02731684767419271
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6358024691358025,
"acc_stderr": 0.026774929899722324,
"acc_norm": 0.6358024691358025,
"acc_norm_stderr": 0.026774929899722324
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.0293922365846125,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.0293922365846125
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4380704041720991,
"acc_stderr": 0.012671902782567654,
"acc_norm": 0.4380704041720991,
"acc_norm_stderr": 0.012671902782567654
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5698529411764706,
"acc_stderr": 0.030074971917302875,
"acc_norm": 0.5698529411764706,
"acc_norm_stderr": 0.030074971917302875
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5506535947712419,
"acc_stderr": 0.02012376652802727,
"acc_norm": 0.5506535947712419,
"acc_norm_stderr": 0.02012376652802727
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6204081632653061,
"acc_stderr": 0.03106721126287248,
"acc_norm": 0.6204081632653061,
"acc_norm_stderr": 0.03106721126287248
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.031157150869355558,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.031157150869355558
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184828,
"mc2": 0.4209102222161438,
"mc2_stderr": 0.014266806466458887
}
}
```
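To inspect these numbers programmatically, one option (a sketch, assuming the `huggingface_hub` client is installed) is to download the raw results file linked above:
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results JSON referenced above from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open_Platypus_and_ccp_2.6w-3_epoch",
    filename="results_2023-09-11T17-27-50.905630.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# Inspect the top-level structure before drilling into the individual metrics.
print(list(data.keys()))
```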
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Mikivis__gpt2-large-lora-sft2 | 2023-09-11T17:30:35.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Mikivis/gpt2-large-lora-sft2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Mikivis/gpt2-large-lora-sft2](https://huggingface.co/Mikivis/gpt2-large-lora-sft2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Mikivis__gpt2-large-lora-sft2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-11T17:29:20.657101](https://huggingface.co/datasets/open-llm-leaderboard/details_Mikivis__gpt2-large-lora-sft2/blob/main/results_2023-09-11T17-29-20.657101.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24889330844605004,\n\
\ \"acc_stderr\": 0.03135888613186268,\n \"acc_norm\": 0.2505879293387577,\n\
\ \"acc_norm_stderr\": 0.031370452952568094,\n \"mc1\": 0.2386780905752754,\n\
\ \"mc1_stderr\": 0.014922629695456418,\n \"mc2\": 0.4030671128915194,\n\
\ \"mc2_stderr\": 0.014369391219834174\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.23464163822525597,\n \"acc_stderr\": 0.012383873560768671,\n\
\ \"acc_norm\": 0.26621160409556316,\n \"acc_norm_stderr\": 0.012915774781523214\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.35839474208325034,\n\
\ \"acc_stderr\": 0.004785488626807567,\n \"acc_norm\": 0.4268074088826927,\n\
\ \"acc_norm_stderr\": 0.0049360298276720374\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.31851851851851853,\n\
\ \"acc_stderr\": 0.0402477840197711,\n \"acc_norm\": 0.31851851851851853,\n\
\ \"acc_norm_stderr\": 0.0402477840197711\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.28289473684210525,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.28289473684210525,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.18,\n\
\ \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.18,\n \
\ \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.02544786382510861,\n\
\ \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.02544786382510861\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.03586879280080342,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.03586879280080342\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483099,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483099\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.038739587141493524,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.038739587141493524\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.251063829787234,\n \"acc_stderr\": 0.028346963777162452,\n\
\ \"acc_norm\": 0.251063829787234,\n \"acc_norm_stderr\": 0.028346963777162452\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.038061426873099935,\n\
\ \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.038061426873099935\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24603174603174602,\n \"acc_stderr\": 0.022182037202948368,\n \"\
acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.022182037202948368\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1746031746031746,\n\
\ \"acc_stderr\": 0.033954900208561116,\n \"acc_norm\": 0.1746031746031746,\n\
\ \"acc_norm_stderr\": 0.033954900208561116\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24838709677419354,\n\
\ \"acc_stderr\": 0.02458002892148101,\n \"acc_norm\": 0.24838709677419354,\n\
\ \"acc_norm_stderr\": 0.02458002892148101\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21717171717171718,\n \"acc_stderr\": 0.029376616484945627,\n \"\
acc_norm\": 0.21717171717171718,\n \"acc_norm_stderr\": 0.029376616484945627\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.23834196891191708,\n \"acc_stderr\": 0.030748905363909902,\n\
\ \"acc_norm\": 0.23834196891191708,\n \"acc_norm_stderr\": 0.030748905363909902\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.21794871794871795,\n \"acc_stderr\": 0.020932445774463185,\n\
\ \"acc_norm\": 0.21794871794871795,\n \"acc_norm_stderr\": 0.020932445774463185\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.026841514322958955,\n\
\ \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.026841514322958955\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23119266055045873,\n \"acc_stderr\": 0.018075750241633146,\n \"\
acc_norm\": 0.23119266055045873,\n \"acc_norm_stderr\": 0.018075750241633146\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.23148148148148148,\n \"acc_stderr\": 0.028765111718046934,\n \"\
acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.028765111718046934\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604243,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604243\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658335,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658335\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2062780269058296,\n\
\ \"acc_stderr\": 0.027157150479563824,\n \"acc_norm\": 0.2062780269058296,\n\
\ \"acc_norm_stderr\": 0.027157150479563824\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.20610687022900764,\n \"acc_stderr\": 0.035477710041594626,\n\
\ \"acc_norm\": 0.20610687022900764,\n \"acc_norm_stderr\": 0.035477710041594626\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.371900826446281,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25766871165644173,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.25766871165644173,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.041577515398656284,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.041577515398656284\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.27184466019417475,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.27184466019417475,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23504273504273504,\n\
\ \"acc_stderr\": 0.027778835904935434,\n \"acc_norm\": 0.23504273504273504,\n\
\ \"acc_norm_stderr\": 0.027778835904935434\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.27330779054916987,\n\
\ \"acc_stderr\": 0.015936681062628556,\n \"acc_norm\": 0.27330779054916987,\n\
\ \"acc_norm_stderr\": 0.015936681062628556\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.29190751445086704,\n \"acc_stderr\": 0.024476994076247323,\n\
\ \"acc_norm\": 0.29190751445086704,\n \"acc_norm_stderr\": 0.024476994076247323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25027932960893856,\n\
\ \"acc_stderr\": 0.01448750085285042,\n \"acc_norm\": 0.25027932960893856,\n\
\ \"acc_norm_stderr\": 0.01448750085285042\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.02473998135511359,\n\
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.02473998135511359\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.2990353697749196,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.29012345679012347,\n \"acc_stderr\": 0.025251173936495022,\n\
\ \"acc_norm\": 0.29012345679012347,\n \"acc_norm_stderr\": 0.025251173936495022\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2765957446808511,\n \"acc_stderr\": 0.026684564340461,\n \
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.026684564340461\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27249022164276404,\n\
\ \"acc_stderr\": 0.01137165829431152,\n \"acc_norm\": 0.27249022164276404,\n\
\ \"acc_norm_stderr\": 0.01137165829431152\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.16544117647058823,\n \"acc_stderr\": 0.022571771025494767,\n\
\ \"acc_norm\": 0.16544117647058823,\n \"acc_norm_stderr\": 0.022571771025494767\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.23366013071895425,\n \"acc_stderr\": 0.017119158496044503,\n \
\ \"acc_norm\": 0.23366013071895425,\n \"acc_norm_stderr\": 0.017119158496044503\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\
\ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\
\ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.20816326530612245,\n \"acc_stderr\": 0.025991117672813292,\n\
\ \"acc_norm\": 0.20816326530612245,\n \"acc_norm_stderr\": 0.025991117672813292\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21686746987951808,\n\
\ \"acc_stderr\": 0.03208284450356365,\n \"acc_norm\": 0.21686746987951808,\n\
\ \"acc_norm_stderr\": 0.03208284450356365\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.031267817146631786,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.031267817146631786\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2386780905752754,\n\
\ \"mc1_stderr\": 0.014922629695456418,\n \"mc2\": 0.4030671128915194,\n\
\ \"mc2_stderr\": 0.014369391219834174\n }\n}\n```"
repo_url: https://huggingface.co/Mikivis/gpt2-large-lora-sft2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-29-20.657101.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-29-20.657101.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-29-20.657101.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-29-20.657101.parquet'
- config_name: results
data_files:
- split: 2023_09_11T17_29_20.657101
path:
- results_2023-09-11T17-29-20.657101.parquet
- split: latest
path:
- results_2023-09-11T17-29-20.657101.parquet
---
# Dataset Card for Evaluation run of Mikivis/gpt2-large-lora-sft2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Mikivis/gpt2-large-lora-sft2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Mikivis/gpt2-large-lora-sft2](https://huggingface.co/Mikivis/gpt2-large-lora-sft2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Mikivis__gpt2-large-lora-sft2",
"harness_truthfulqa_mc_0",
split="train")
```
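Each configuration also exposes a "latest" split (see the configuration list above), and the aggregated metrics live in the "results" configuration; a minimal sketch of loading both:
```python
from datasets import load_dataset

# Per-example details for a task; the "latest" split always mirrors the most recent run.
details = load_dataset("open-llm-leaderboard/details_Mikivis__gpt2-large-lora-sft2",
	"harness_truthfulqa_mc_0",
	split="latest")

# Aggregated metrics for the run, stored in the "results" configuration.
results = load_dataset("open-llm-leaderboard/details_Mikivis__gpt2-large-lora-sft2",
	"results",
	split="latest")
```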
## Latest results
These are the [latest results from run 2023-09-11T17:29:20.657101](https://huggingface.co/datasets/open-llm-leaderboard/details_Mikivis__gpt2-large-lora-sft2/blob/main/results_2023-09-11T17-29-20.657101.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24889330844605004,
"acc_stderr": 0.03135888613186268,
"acc_norm": 0.2505879293387577,
"acc_norm_stderr": 0.031370452952568094,
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456418,
"mc2": 0.4030671128915194,
"mc2_stderr": 0.014369391219834174
},
"harness|arc:challenge|25": {
"acc": 0.23464163822525597,
"acc_stderr": 0.012383873560768671,
"acc_norm": 0.26621160409556316,
"acc_norm_stderr": 0.012915774781523214
},
"harness|hellaswag|10": {
"acc": 0.35839474208325034,
"acc_stderr": 0.004785488626807567,
"acc_norm": 0.4268074088826927,
"acc_norm_stderr": 0.0049360298276720374
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.0402477840197711,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.0402477840197711
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.28289473684210525,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.28289473684210525,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.02544786382510861,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.02544786382510861
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080342,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080342
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483099,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483099
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.038739587141493524,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.038739587141493524
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.251063829787234,
"acc_stderr": 0.028346963777162452,
"acc_norm": 0.251063829787234,
"acc_norm_stderr": 0.028346963777162452
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.296551724137931,
"acc_stderr": 0.038061426873099935,
"acc_norm": 0.296551724137931,
"acc_norm_stderr": 0.038061426873099935
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.022182037202948368,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.022182037202948368
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1746031746031746,
"acc_stderr": 0.033954900208561116,
"acc_norm": 0.1746031746031746,
"acc_norm_stderr": 0.033954900208561116
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24838709677419354,
"acc_stderr": 0.02458002892148101,
"acc_norm": 0.24838709677419354,
"acc_norm_stderr": 0.02458002892148101
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21717171717171718,
"acc_stderr": 0.029376616484945627,
"acc_norm": 0.21717171717171718,
"acc_norm_stderr": 0.029376616484945627
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23834196891191708,
"acc_stderr": 0.030748905363909902,
"acc_norm": 0.23834196891191708,
"acc_norm_stderr": 0.030748905363909902
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.21794871794871795,
"acc_stderr": 0.020932445774463185,
"acc_norm": 0.21794871794871795,
"acc_norm_stderr": 0.020932445774463185
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2184873949579832,
"acc_stderr": 0.026841514322958955,
"acc_norm": 0.2184873949579832,
"acc_norm_stderr": 0.026841514322958955
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23119266055045873,
"acc_stderr": 0.018075750241633146,
"acc_norm": 0.23119266055045873,
"acc_norm_stderr": 0.018075750241633146
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.028765111718046934,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.028765111718046934
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604243,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604243
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658335,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658335
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2062780269058296,
"acc_stderr": 0.027157150479563824,
"acc_norm": 0.2062780269058296,
"acc_norm_stderr": 0.027157150479563824
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.20610687022900764,
"acc_stderr": 0.035477710041594626,
"acc_norm": 0.20610687022900764,
"acc_norm_stderr": 0.035477710041594626
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25766871165644173,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.25766871165644173,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.041577515398656284,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.041577515398656284
},
"harness|hendrycksTest-management|5": {
"acc": 0.27184466019417475,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.27184466019417475,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23504273504273504,
"acc_stderr": 0.027778835904935434,
"acc_norm": 0.23504273504273504,
"acc_norm_stderr": 0.027778835904935434
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.27330779054916987,
"acc_stderr": 0.015936681062628556,
"acc_norm": 0.27330779054916987,
"acc_norm_stderr": 0.015936681062628556
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.29190751445086704,
"acc_stderr": 0.024476994076247323,
"acc_norm": 0.29190751445086704,
"acc_norm_stderr": 0.024476994076247323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25027932960893856,
"acc_stderr": 0.01448750085285042,
"acc_norm": 0.25027932960893856,
"acc_norm_stderr": 0.01448750085285042
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.29012345679012347,
"acc_stderr": 0.025251173936495022,
"acc_norm": 0.29012345679012347,
"acc_norm_stderr": 0.025251173936495022
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.026684564340461,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.026684564340461
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27249022164276404,
"acc_stderr": 0.01137165829431152,
"acc_norm": 0.27249022164276404,
"acc_norm_stderr": 0.01137165829431152
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.16544117647058823,
"acc_stderr": 0.022571771025494767,
"acc_norm": 0.16544117647058823,
"acc_norm_stderr": 0.022571771025494767
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.23366013071895425,
"acc_stderr": 0.017119158496044503,
"acc_norm": 0.23366013071895425,
"acc_norm_stderr": 0.017119158496044503
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.20816326530612245,
"acc_stderr": 0.025991117672813292,
"acc_norm": 0.20816326530612245,
"acc_norm_stderr": 0.025991117672813292
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21686746987951808,
"acc_stderr": 0.03208284450356365,
"acc_norm": 0.21686746987951808,
"acc_norm_stderr": 0.03208284450356365
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456418,
"mc2": 0.4030671128915194,
"mc2_stderr": 0.014369391219834174
}
}
```
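To work with these numbers programmatically, you can fetch the raw JSON file linked above. A minimal sketch using `huggingface_hub` (the file name comes from the link above; how the metrics are nested inside the file is an assumption based on the structure shown):
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results JSON linked above from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Mikivis__gpt2-large-lora-sft2",
    filename="results_2023-09-11T17-29-20.657101.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# Some result files nest the metrics under a top-level "results" key;
# fall back to the document root if they do not (assumption, see lead-in).
metrics = data.get("results", data)
print(metrics["all"]["acc"])  # 0.24889330844605004 for the run above
```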
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1-1-SuperHOT-8K-GPTQ | 2023-09-11T17:33:30.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-GPTQ](https://huggingface.co/TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1-1-SuperHOT-8K-GPTQ\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-11T17:32:08.880546](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1-1-SuperHOT-8K-GPTQ/blob/main/results_2023-09-11T17-32-08.880546.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\"\
\ split for each eval):\n\n```python\n{\n    \"all\": {\n        \"acc\": 0.47448637397526006,\n\
\ \"acc_stderr\": 0.035045561337073074,\n \"acc_norm\": 0.47815943269582295,\n\
\ \"acc_norm_stderr\": 0.03502896256034419,\n \"mc1\": 0.3659730722154223,\n\
\ \"mc1_stderr\": 0.016862941684088376,\n \"mc2\": 0.5345534336987072,\n\
\ \"mc2_stderr\": 0.01574114618973484\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5486348122866894,\n \"acc_stderr\": 0.014542104569955265,\n\
\ \"acc_norm\": 0.5699658703071673,\n \"acc_norm_stderr\": 0.014467631559137993\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6078470424218283,\n\
\ \"acc_stderr\": 0.004872326888655519,\n \"acc_norm\": 0.8032264489145589,\n\
\ \"acc_norm_stderr\": 0.003967472072468517\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.040657710025626036,\n\
\ \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.040657710025626036\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5622641509433962,\n \"acc_stderr\": 0.030533338430467516,\n\
\ \"acc_norm\": 0.5622641509433962,\n \"acc_norm_stderr\": 0.030533338430467516\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04155319955593146,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04155319955593146\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4508670520231214,\n\
\ \"acc_stderr\": 0.037940126746970296,\n \"acc_norm\": 0.4508670520231214,\n\
\ \"acc_norm_stderr\": 0.037940126746970296\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.032232762667117124,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.032232762667117124\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.04096985139843672,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.04096985139843672\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3931034482758621,\n \"acc_stderr\": 0.0407032901370707,\n\
\ \"acc_norm\": 0.3931034482758621,\n \"acc_norm_stderr\": 0.0407032901370707\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.02351729433596329,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02351729433596329\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795133,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795133\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.532258064516129,\n \"acc_stderr\": 0.028384747788813332,\n \"\
acc_norm\": 0.532258064516129,\n \"acc_norm_stderr\": 0.028384747788813332\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3448275862068966,\n \"acc_stderr\": 0.03344283744280458,\n \"\
acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03344283744280458\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.20606060606060606,\n \"acc_stderr\": 0.0315841532404771,\n\
\ \"acc_norm\": 0.20606060606060606,\n \"acc_norm_stderr\": 0.0315841532404771\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5808080808080808,\n \"acc_stderr\": 0.03515520728670417,\n \"\
acc_norm\": 0.5808080808080808,\n \"acc_norm_stderr\": 0.03515520728670417\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6683937823834197,\n \"acc_stderr\": 0.03397636541089118,\n\
\ \"acc_norm\": 0.6683937823834197,\n \"acc_norm_stderr\": 0.03397636541089118\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.47435897435897434,\n \"acc_stderr\": 0.025317649726448656,\n\
\ \"acc_norm\": 0.47435897435897434,\n \"acc_norm_stderr\": 0.025317649726448656\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2222222222222222,\n \"acc_stderr\": 0.025348097468097852,\n \
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.025348097468097852\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.03243718055137411,\n \
\ \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.03243718055137411\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6697247706422018,\n \"acc_stderr\": 0.020164466336342977,\n \"\
acc_norm\": 0.6697247706422018,\n \"acc_norm_stderr\": 0.020164466336342977\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.37962962962962965,\n \"acc_stderr\": 0.03309682581119035,\n \"\
acc_norm\": 0.37962962962962965,\n \"acc_norm_stderr\": 0.03309682581119035\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501947,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501947\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6160337552742616,\n \"acc_stderr\": 0.03165867806410668,\n \
\ \"acc_norm\": 0.6160337552742616,\n \"acc_norm_stderr\": 0.03165867806410668\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5112107623318386,\n\
\ \"acc_stderr\": 0.033549366530984746,\n \"acc_norm\": 0.5112107623318386,\n\
\ \"acc_norm_stderr\": 0.033549366530984746\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.043564472026650695,\n\
\ \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.043564472026650695\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.04369236326573981,\n \"\
acc_norm\": 0.6446280991735537,\n \"acc_norm_stderr\": 0.04369236326573981\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5092592592592593,\n\
\ \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.5092592592592593,\n\
\ \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5460122699386503,\n \"acc_stderr\": 0.0391170190467718,\n\
\ \"acc_norm\": 0.5460122699386503,\n \"acc_norm_stderr\": 0.0391170190467718\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.0465614711001235,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.0465614711001235\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7008547008547008,\n\
\ \"acc_stderr\": 0.02999695185834948,\n \"acc_norm\": 0.7008547008547008,\n\
\ \"acc_norm_stderr\": 0.02999695185834948\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.01685739124747255,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.01685739124747255\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5606936416184971,\n \"acc_stderr\": 0.026720034380514995,\n\
\ \"acc_norm\": 0.5606936416184971,\n \"acc_norm_stderr\": 0.026720034380514995\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2759776536312849,\n\
\ \"acc_stderr\": 0.014950103002475349,\n \"acc_norm\": 0.2759776536312849,\n\
\ \"acc_norm_stderr\": 0.014950103002475349\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5751633986928104,\n \"acc_stderr\": 0.02830457667314112,\n\
\ \"acc_norm\": 0.5751633986928104,\n \"acc_norm_stderr\": 0.02830457667314112\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5530546623794212,\n\
\ \"acc_stderr\": 0.028237769422085335,\n \"acc_norm\": 0.5530546623794212,\n\
\ \"acc_norm_stderr\": 0.028237769422085335\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5493827160493827,\n \"acc_stderr\": 0.027684721415656203,\n\
\ \"acc_norm\": 0.5493827160493827,\n \"acc_norm_stderr\": 0.027684721415656203\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3546099290780142,\n \"acc_stderr\": 0.028538650028878638,\n \
\ \"acc_norm\": 0.3546099290780142,\n \"acc_norm_stderr\": 0.028538650028878638\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39308996088657105,\n\
\ \"acc_stderr\": 0.012474899613873961,\n \"acc_norm\": 0.39308996088657105,\n\
\ \"acc_norm_stderr\": 0.012474899613873961\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5220588235294118,\n \"acc_stderr\": 0.030343264224213528,\n\
\ \"acc_norm\": 0.5220588235294118,\n \"acc_norm_stderr\": 0.030343264224213528\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4803921568627451,\n \"acc_stderr\": 0.020212274976302957,\n \
\ \"acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.020212274976302957\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4818181818181818,\n\
\ \"acc_stderr\": 0.04785964010794916,\n \"acc_norm\": 0.4818181818181818,\n\
\ \"acc_norm_stderr\": 0.04785964010794916\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5918367346938775,\n \"acc_stderr\": 0.03146465712827424,\n\
\ \"acc_norm\": 0.5918367346938775,\n \"acc_norm_stderr\": 0.03146465712827424\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6865671641791045,\n\
\ \"acc_stderr\": 0.032801882053486414,\n \"acc_norm\": 0.6865671641791045,\n\
\ \"acc_norm_stderr\": 0.032801882053486414\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7076023391812866,\n \"acc_stderr\": 0.03488647713457922,\n\
\ \"acc_norm\": 0.7076023391812866,\n \"acc_norm_stderr\": 0.03488647713457922\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3659730722154223,\n\
\ \"mc1_stderr\": 0.016862941684088376,\n \"mc2\": 0.5345534336987072,\n\
\ \"mc2_stderr\": 0.01574114618973484\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-32-08.880546.parquet'
- config_name: results
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- results_2023-09-11T17-32-08.880546.parquet
- split: latest
path:
- results_2023-09-11T17-32-08.880546.parquet
---
# Dataset Card for Evaluation run of TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-GPTQ](https://huggingface.co/TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1-1-SuperHOT-8K-GPTQ",
"harness_truthfulqa_mc_0",
split="train")
```
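The per-task configurations listed in the YAML header above also expose a `latest` split, and the aggregated scores live in the `results` configuration; a minimal sketch of loading those with the same `datasets` API:
```python
from datasets import load_dataset

# Aggregated metrics for the most recent run; the "latest" split always
# points at the newest timestamped split of the "results" configuration.
results = load_dataset(
    "open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1-1-SuperHOT-8K-GPTQ",
    "results",
    split="latest",
)
```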
## Latest results
These are the [latest results from run 2023-09-11T17:32:08.880546](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1-1-SuperHOT-8K-GPTQ/blob/main/results_2023-09-11T17-32-08.880546.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.47448637397526006,
"acc_stderr": 0.035045561337073074,
"acc_norm": 0.47815943269582295,
"acc_norm_stderr": 0.03502896256034419,
"mc1": 0.3659730722154223,
"mc1_stderr": 0.016862941684088376,
"mc2": 0.5345534336987072,
"mc2_stderr": 0.01574114618973484
},
"harness|arc:challenge|25": {
"acc": 0.5486348122866894,
"acc_stderr": 0.014542104569955265,
"acc_norm": 0.5699658703071673,
"acc_norm_stderr": 0.014467631559137993
},
"harness|hellaswag|10": {
"acc": 0.6078470424218283,
"acc_stderr": 0.004872326888655519,
"acc_norm": 0.8032264489145589,
"acc_norm_stderr": 0.003967472072468517
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.48026315789473684,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.48026315789473684,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5622641509433962,
"acc_stderr": 0.030533338430467516,
"acc_norm": 0.5622641509433962,
"acc_norm_stderr": 0.030533338430467516
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04155319955593146,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04155319955593146
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4508670520231214,
"acc_stderr": 0.037940126746970296,
"acc_norm": 0.4508670520231214,
"acc_norm_stderr": 0.037940126746970296
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.032232762667117124,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.032232762667117124
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843672,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843672
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3931034482758621,
"acc_stderr": 0.0407032901370707,
"acc_norm": 0.3931034482758621,
"acc_norm_stderr": 0.0407032901370707
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.02351729433596329,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.02351729433596329
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795133,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795133
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.532258064516129,
"acc_stderr": 0.028384747788813332,
"acc_norm": 0.532258064516129,
"acc_norm_stderr": 0.028384747788813332
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03344283744280458,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03344283744280458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.20606060606060606,
"acc_stderr": 0.0315841532404771,
"acc_norm": 0.20606060606060606,
"acc_norm_stderr": 0.0315841532404771
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5808080808080808,
"acc_stderr": 0.03515520728670417,
"acc_norm": 0.5808080808080808,
"acc_norm_stderr": 0.03515520728670417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6683937823834197,
"acc_stderr": 0.03397636541089118,
"acc_norm": 0.6683937823834197,
"acc_norm_stderr": 0.03397636541089118
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.47435897435897434,
"acc_stderr": 0.025317649726448656,
"acc_norm": 0.47435897435897434,
"acc_norm_stderr": 0.025317649726448656
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.025348097468097852,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.025348097468097852
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5252100840336135,
"acc_stderr": 0.03243718055137411,
"acc_norm": 0.5252100840336135,
"acc_norm_stderr": 0.03243718055137411
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6697247706422018,
"acc_stderr": 0.020164466336342977,
"acc_norm": 0.6697247706422018,
"acc_norm_stderr": 0.020164466336342977
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.030190282453501947,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.030190282453501947
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6160337552742616,
"acc_stderr": 0.03165867806410668,
"acc_norm": 0.6160337552742616,
"acc_norm_stderr": 0.03165867806410668
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5112107623318386,
"acc_stderr": 0.033549366530984746,
"acc_norm": 0.5112107623318386,
"acc_norm_stderr": 0.033549366530984746
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5572519083969466,
"acc_stderr": 0.043564472026650695,
"acc_norm": 0.5572519083969466,
"acc_norm_stderr": 0.043564472026650695
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.04369236326573981,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.04369236326573981
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5460122699386503,
"acc_stderr": 0.0391170190467718,
"acc_norm": 0.5460122699386503,
"acc_norm_stderr": 0.0391170190467718
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.0465614711001235,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.0465614711001235
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7008547008547008,
"acc_stderr": 0.02999695185834948,
"acc_norm": 0.7008547008547008,
"acc_norm_stderr": 0.02999695185834948
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.01685739124747255,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.01685739124747255
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.026720034380514995,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.026720034380514995
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2759776536312849,
"acc_stderr": 0.014950103002475349,
"acc_norm": 0.2759776536312849,
"acc_norm_stderr": 0.014950103002475349
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5751633986928104,
"acc_stderr": 0.02830457667314112,
"acc_norm": 0.5751633986928104,
"acc_norm_stderr": 0.02830457667314112
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5530546623794212,
"acc_stderr": 0.028237769422085335,
"acc_norm": 0.5530546623794212,
"acc_norm_stderr": 0.028237769422085335
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5493827160493827,
"acc_stderr": 0.027684721415656203,
"acc_norm": 0.5493827160493827,
"acc_norm_stderr": 0.027684721415656203
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3546099290780142,
"acc_stderr": 0.028538650028878638,
"acc_norm": 0.3546099290780142,
"acc_norm_stderr": 0.028538650028878638
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39308996088657105,
"acc_stderr": 0.012474899613873961,
"acc_norm": 0.39308996088657105,
"acc_norm_stderr": 0.012474899613873961
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5220588235294118,
"acc_stderr": 0.030343264224213528,
"acc_norm": 0.5220588235294118,
"acc_norm_stderr": 0.030343264224213528
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4803921568627451,
"acc_stderr": 0.020212274976302957,
"acc_norm": 0.4803921568627451,
"acc_norm_stderr": 0.020212274976302957
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4818181818181818,
"acc_stderr": 0.04785964010794916,
"acc_norm": 0.4818181818181818,
"acc_norm_stderr": 0.04785964010794916
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5918367346938775,
"acc_stderr": 0.03146465712827424,
"acc_norm": 0.5918367346938775,
"acc_norm_stderr": 0.03146465712827424
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6865671641791045,
"acc_stderr": 0.032801882053486414,
"acc_norm": 0.6865671641791045,
"acc_norm_stderr": 0.032801882053486414
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7076023391812866,
"acc_stderr": 0.03488647713457922,
"acc_norm": 0.7076023391812866,
"acc_norm_stderr": 0.03488647713457922
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3659730722154223,
"mc1_stderr": 0.016862941684088376,
"mc2": 0.5345534336987072,
"mc2_stderr": 0.01574114618973484
}
}
```
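The same aggregate numbers can also be read straight from the raw JSON file linked above, without going through `datasets`; a minimal sketch using `huggingface_hub`, assuming the file mirrors the structure shown in the snippet (with the metrics under the `"all"` key):
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results file referenced above from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1-1-SuperHOT-8K-GPTQ",
    filename="results_2023-09-11T17-32-08.880546.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# Aggregate metrics, e.g. {"acc": 0.4744..., "mc2": 0.5345..., ...}
print(data["all"])
```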
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mctaco-modified4 | 2023-09-11T17:34:18.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Charlie911/vicuna-7b-v1.5-lora-mctaco-modified4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Charlie911/vicuna-7b-v1.5-lora-mctaco-modified4](https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mctaco-modified4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mctaco-modified4\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-11T17:32:59.033048](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mctaco-modified4/blob/main/results_2023-09-11T17-32-59.033048.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.47191447641127826,\n\
\ \"acc_stderr\": 0.03501072519384273,\n \"acc_norm\": 0.4758359941204308,\n\
\ \"acc_norm_stderr\": 0.035006442383521436,\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.014896277441041838,\n \"mc2\": 0.415907411933743,\n\
\ \"mc2_stderr\": 0.014510963019987853\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3643344709897611,\n \"acc_stderr\": 0.014063260279882412,\n\
\ \"acc_norm\": 0.4069965870307167,\n \"acc_norm_stderr\": 0.014356399418009123\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5421230830511851,\n\
\ \"acc_stderr\": 0.004972042602001381,\n \"acc_norm\": 0.7308305118502291,\n\
\ \"acc_norm_stderr\": 0.0044262176549180006\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.47547169811320755,\n \"acc_stderr\": 0.030735822206205608,\n\
\ \"acc_norm\": 0.47547169811320755,\n \"acc_norm_stderr\": 0.030735822206205608\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4791666666666667,\n\
\ \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 0.4791666666666667,\n\
\ \"acc_norm_stderr\": 0.041775789507399935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3872832369942196,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.3872832369942196,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.14705882352941177,\n \"acc_stderr\": 0.035240689515674474,\n\
\ \"acc_norm\": 0.14705882352941177,\n \"acc_norm_stderr\": 0.035240689515674474\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3872340425531915,\n \"acc_stderr\": 0.03184389265339525,\n\
\ \"acc_norm\": 0.3872340425531915,\n \"acc_norm_stderr\": 0.03184389265339525\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.043036840335373146,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.043036840335373146\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.041443118108781506,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.041443118108781506\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.02422996529842509,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.02422996529842509\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.040406101782088394,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.040406101782088394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5387096774193548,\n\
\ \"acc_stderr\": 0.02835863485983693,\n \"acc_norm\": 0.5387096774193548,\n\
\ \"acc_norm_stderr\": 0.02835863485983693\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.37438423645320196,\n \"acc_stderr\": 0.03405155380561952,\n\
\ \"acc_norm\": 0.37438423645320196,\n \"acc_norm_stderr\": 0.03405155380561952\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.036974422050315946,\n\
\ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.036974422050315946\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5959595959595959,\n \"acc_stderr\": 0.03496130972056128,\n \"\
acc_norm\": 0.5959595959595959,\n \"acc_norm_stderr\": 0.03496130972056128\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6424870466321243,\n \"acc_stderr\": 0.034588160421810114,\n\
\ \"acc_norm\": 0.6424870466321243,\n \"acc_norm_stderr\": 0.034588160421810114\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.024838811988033165,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.024838811988033165\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507385,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507385\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3949579831932773,\n \"acc_stderr\": 0.03175367846096625,\n \
\ \"acc_norm\": 0.3949579831932773,\n \"acc_norm_stderr\": 0.03175367846096625\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6385321100917432,\n \"acc_stderr\": 0.02059808200993738,\n \"\
acc_norm\": 0.6385321100917432,\n \"acc_norm_stderr\": 0.02059808200993738\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3055555555555556,\n \"acc_stderr\": 0.03141554629402543,\n \"\
acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.03141554629402543\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236435,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236435\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6540084388185654,\n \"acc_stderr\": 0.03096481058878671,\n \
\ \"acc_norm\": 0.6540084388185654,\n \"acc_norm_stderr\": 0.03096481058878671\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5874439461883408,\n\
\ \"acc_stderr\": 0.03304062175449296,\n \"acc_norm\": 0.5874439461883408,\n\
\ \"acc_norm_stderr\": 0.03304062175449296\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5785123966942148,\n \"acc_stderr\": 0.04507732278775088,\n \"\
acc_norm\": 0.5785123966942148,\n \"acc_norm_stderr\": 0.04507732278775088\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.558282208588957,\n \"acc_stderr\": 0.03901591825836185,\n\
\ \"acc_norm\": 0.558282208588957,\n \"acc_norm_stderr\": 0.03901591825836185\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5922330097087378,\n \"acc_stderr\": 0.0486577757041077,\n\
\ \"acc_norm\": 0.5922330097087378,\n \"acc_norm_stderr\": 0.0486577757041077\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7307692307692307,\n\
\ \"acc_stderr\": 0.029058588303748842,\n \"acc_norm\": 0.7307692307692307,\n\
\ \"acc_norm_stderr\": 0.029058588303748842\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6526181353767561,\n\
\ \"acc_stderr\": 0.01702667174865573,\n \"acc_norm\": 0.6526181353767561,\n\
\ \"acc_norm_stderr\": 0.01702667174865573\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5115606936416185,\n \"acc_stderr\": 0.026911898686377927,\n\
\ \"acc_norm\": 0.5115606936416185,\n \"acc_norm_stderr\": 0.026911898686377927\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.014355911964767867,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.014355911964767867\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4934640522875817,\n \"acc_stderr\": 0.028627470550556047,\n\
\ \"acc_norm\": 0.4934640522875817,\n \"acc_norm_stderr\": 0.028627470550556047\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5401929260450161,\n\
\ \"acc_stderr\": 0.028306190403305696,\n \"acc_norm\": 0.5401929260450161,\n\
\ \"acc_norm_stderr\": 0.028306190403305696\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5432098765432098,\n \"acc_stderr\": 0.027716661650194038,\n\
\ \"acc_norm\": 0.5432098765432098,\n \"acc_norm_stderr\": 0.027716661650194038\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.35106382978723405,\n \"acc_stderr\": 0.028473501272963764,\n \
\ \"acc_norm\": 0.35106382978723405,\n \"acc_norm_stderr\": 0.028473501272963764\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3709256844850065,\n\
\ \"acc_stderr\": 0.012337391684530312,\n \"acc_norm\": 0.3709256844850065,\n\
\ \"acc_norm_stderr\": 0.012337391684530312\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.030134614954403924,\n \
\ \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.030134614954403924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.45098039215686275,\n \"acc_stderr\": 0.020130388312904528,\n \
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.020130388312904528\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n\
\ \"acc_stderr\": 0.04785964010794916,\n \"acc_norm\": 0.5181818181818182,\n\
\ \"acc_norm_stderr\": 0.04785964010794916\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.03168091161233882,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.03168091161233882\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6616915422885572,\n\
\ \"acc_stderr\": 0.03345563070339193,\n \"acc_norm\": 0.6616915422885572,\n\
\ \"acc_norm_stderr\": 0.03345563070339193\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3493975903614458,\n\
\ \"acc_stderr\": 0.0371172519074075,\n \"acc_norm\": 0.3493975903614458,\n\
\ \"acc_norm_stderr\": 0.0371172519074075\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6491228070175439,\n \"acc_stderr\": 0.03660298834049163,\n\
\ \"acc_norm\": 0.6491228070175439,\n \"acc_norm_stderr\": 0.03660298834049163\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.014896277441041838,\n \"mc2\": 0.415907411933743,\n\
\ \"mc2_stderr\": 0.014510963019987853\n }\n}\n```"
repo_url: https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mctaco-modified4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-32-59.033048.parquet'
- config_name: results
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- results_2023-09-11T17-32-59.033048.parquet
- split: latest
path:
- results_2023-09-11T17-32-59.033048.parquet
---
# Dataset Card for Evaluation run of Charlie911/vicuna-7b-v1.5-lora-mctaco-modified4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mctaco-modified4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Charlie911/vicuna-7b-v1.5-lora-mctaco-modified4](https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mctaco-modified4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mctaco-modified4",
"harness_truthfulqa_mc_0",
split="train")
```
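Each evaluated task also has its own configuration (listed under `configs` in the YAML front matter above), and every configuration exposes both a split named after the run timestamp and a `latest` split. As a minimal sketch, assuming the same `datasets` API as above, a single subtask from a specific run can be selected like this:
```python
from datasets import load_dataset

# Load one MMLU subtask; the split name is the run timestamp used in the
# configs above ("latest" resolves to the same files for this single run).
details = load_dataset(
    "open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mctaco-modified4",
    "harness_hendrycksTest_high_school_chemistry_5",
    split="2023_09_11T17_32_59.033048",
)
```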
## Latest results
These are the [latest results from run 2023-09-11T17:32:59.033048](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mctaco-modified4/blob/main/results_2023-09-11T17-32-59.033048.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
```python
{
"all": {
"acc": 0.47191447641127826,
"acc_stderr": 0.03501072519384273,
"acc_norm": 0.4758359941204308,
"acc_norm_stderr": 0.035006442383521436,
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041838,
"mc2": 0.415907411933743,
"mc2_stderr": 0.014510963019987853
},
"harness|arc:challenge|25": {
"acc": 0.3643344709897611,
"acc_stderr": 0.014063260279882412,
"acc_norm": 0.4069965870307167,
"acc_norm_stderr": 0.014356399418009123
},
"harness|hellaswag|10": {
"acc": 0.5421230830511851,
"acc_stderr": 0.004972042602001381,
"acc_norm": 0.7308305118502291,
"acc_norm_stderr": 0.0044262176549180006
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.47547169811320755,
"acc_stderr": 0.030735822206205608,
"acc_norm": 0.47547169811320755,
"acc_norm_stderr": 0.030735822206205608
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4791666666666667,
"acc_stderr": 0.041775789507399935,
"acc_norm": 0.4791666666666667,
"acc_norm_stderr": 0.041775789507399935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3872832369942196,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.3872832369942196,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.14705882352941177,
"acc_stderr": 0.035240689515674474,
"acc_norm": 0.14705882352941177,
"acc_norm_stderr": 0.035240689515674474
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3872340425531915,
"acc_stderr": 0.03184389265339525,
"acc_norm": 0.3872340425531915,
"acc_norm_stderr": 0.03184389265339525
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.043036840335373146,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.043036840335373146
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.041443118108781506,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.041443118108781506
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.02422996529842509,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.02422996529842509
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.040406101782088394,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.040406101782088394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5387096774193548,
"acc_stderr": 0.02835863485983693,
"acc_norm": 0.5387096774193548,
"acc_norm_stderr": 0.02835863485983693
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.37438423645320196,
"acc_stderr": 0.03405155380561952,
"acc_norm": 0.37438423645320196,
"acc_norm_stderr": 0.03405155380561952
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.036974422050315946,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.036974422050315946
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5959595959595959,
"acc_stderr": 0.03496130972056128,
"acc_norm": 0.5959595959595959,
"acc_norm_stderr": 0.03496130972056128
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6424870466321243,
"acc_stderr": 0.034588160421810114,
"acc_norm": 0.6424870466321243,
"acc_norm_stderr": 0.034588160421810114
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4,
"acc_stderr": 0.024838811988033165,
"acc_norm": 0.4,
"acc_norm_stderr": 0.024838811988033165
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507385,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507385
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3949579831932773,
"acc_stderr": 0.03175367846096625,
"acc_norm": 0.3949579831932773,
"acc_norm_stderr": 0.03175367846096625
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6385321100917432,
"acc_stderr": 0.02059808200993738,
"acc_norm": 0.6385321100917432,
"acc_norm_stderr": 0.02059808200993738
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.03141554629402543,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.03141554629402543
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03308611113236435,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03308611113236435
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6540084388185654,
"acc_stderr": 0.03096481058878671,
"acc_norm": 0.6540084388185654,
"acc_norm_stderr": 0.03096481058878671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5874439461883408,
"acc_stderr": 0.03304062175449296,
"acc_norm": 0.5874439461883408,
"acc_norm_stderr": 0.03304062175449296
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5785123966942148,
"acc_stderr": 0.04507732278775088,
"acc_norm": 0.5785123966942148,
"acc_norm_stderr": 0.04507732278775088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.558282208588957,
"acc_stderr": 0.03901591825836185,
"acc_norm": 0.558282208588957,
"acc_norm_stderr": 0.03901591825836185
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.5922330097087378,
"acc_stderr": 0.0486577757041077,
"acc_norm": 0.5922330097087378,
"acc_norm_stderr": 0.0486577757041077
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7307692307692307,
"acc_stderr": 0.029058588303748842,
"acc_norm": 0.7307692307692307,
"acc_norm_stderr": 0.029058588303748842
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6526181353767561,
"acc_stderr": 0.01702667174865573,
"acc_norm": 0.6526181353767561,
"acc_norm_stderr": 0.01702667174865573
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5115606936416185,
"acc_stderr": 0.026911898686377927,
"acc_norm": 0.5115606936416185,
"acc_norm_stderr": 0.026911898686377927
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767867,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767867
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4934640522875817,
"acc_stderr": 0.028627470550556047,
"acc_norm": 0.4934640522875817,
"acc_norm_stderr": 0.028627470550556047
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5401929260450161,
"acc_stderr": 0.028306190403305696,
"acc_norm": 0.5401929260450161,
"acc_norm_stderr": 0.028306190403305696
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5432098765432098,
"acc_stderr": 0.027716661650194038,
"acc_norm": 0.5432098765432098,
"acc_norm_stderr": 0.027716661650194038
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35106382978723405,
"acc_stderr": 0.028473501272963764,
"acc_norm": 0.35106382978723405,
"acc_norm_stderr": 0.028473501272963764
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3709256844850065,
"acc_stderr": 0.012337391684530312,
"acc_norm": 0.3709256844850065,
"acc_norm_stderr": 0.012337391684530312
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4375,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.020130388312904528,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.020130388312904528
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794916,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794916
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.03168091161233882,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.03168091161233882
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6616915422885572,
"acc_stderr": 0.03345563070339193,
"acc_norm": 0.6616915422885572,
"acc_norm_stderr": 0.03345563070339193
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3493975903614458,
"acc_stderr": 0.0371172519074075,
"acc_norm": 0.3493975903614458,
"acc_norm_stderr": 0.0371172519074075
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6491228070175439,
"acc_stderr": 0.03660298834049163,
"acc_norm": 0.6491228070175439,
"acc_norm_stderr": 0.03660298834049163
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041838,
"mc2": 0.415907411933743,
"mc2_stderr": 0.014510963019987853
}
}
```
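As a hedged sketch (assuming only the standard `huggingface_hub` download API, and that the raw file linked above mirrors the excerpt shown), the aggregated "all" metrics can be read back programmatically:
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results JSON referenced in the "Latest results" link.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mctaco-modified4",
    filename="results_2023-09-11T17-32-59.033048.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

# The excerpt above shows the metrics at the top level; if the actual file
# nests them under a "results" key instead, prefer that key.
metrics = results.get("results", results)
print(metrics["all"])  # acc, acc_norm, mc1, mc2 and their stderrs
```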
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
BryanDRC/first-dataset | 2023-09-11T17:33:35.000Z | [
"region:us"
] | BryanDRC | null | null | null | 0 | 0 | Entry not found |
valeriamontero/test | 2023-09-11T17:36:50.000Z | [
"region:us"
] | valeriamontero | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_Azure99__blossom-v2-llama2-7b | 2023-09-11T17:40:40.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Azure99/blossom-v2-llama2-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Azure99/blossom-v2-llama2-7b](https://huggingface.co/Azure99/blossom-v2-llama2-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Azure99__blossom-v2-llama2-7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-11T17:39:22.579303](https://huggingface.co/datasets/open-llm-leaderboard/details_Azure99__blossom-v2-llama2-7b/blob/main/results_2023-09-11T17-39-22.579303.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5178148983618301,\n\
\ \"acc_stderr\": 0.034987688011471604,\n \"acc_norm\": 0.5215426213433675,\n\
\ \"acc_norm_stderr\": 0.03497337234024997,\n \"mc1\": 0.31946144430844553,\n\
\ \"mc1_stderr\": 0.016322644182960505,\n \"mc2\": 0.4683906945088401,\n\
\ \"mc2_stderr\": 0.015179636754528561\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5085324232081911,\n \"acc_stderr\": 0.014609263165632186,\n\
\ \"acc_norm\": 0.5409556313993175,\n \"acc_norm_stderr\": 0.01456229107360123\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5981876120294762,\n\
\ \"acc_stderr\": 0.0048926244909372205,\n \"acc_norm\": 0.785700059749054,\n\
\ \"acc_norm_stderr\": 0.0040949719808920804\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480863,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480863\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5547169811320755,\n \"acc_stderr\": 0.030588052974270655,\n\
\ \"acc_norm\": 0.5547169811320755,\n \"acc_norm_stderr\": 0.030588052974270655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5138888888888888,\n\
\ \"acc_stderr\": 0.04179596617581,\n \"acc_norm\": 0.5138888888888888,\n\
\ \"acc_norm_stderr\": 0.04179596617581\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.47398843930635837,\n\
\ \"acc_stderr\": 0.038073017265045125,\n \"acc_norm\": 0.47398843930635837,\n\
\ \"acc_norm_stderr\": 0.038073017265045125\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.032232762667117124,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.032232762667117124\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30423280423280424,\n \"acc_stderr\": 0.02369541500946308,\n \"\
acc_norm\": 0.30423280423280424,\n \"acc_norm_stderr\": 0.02369541500946308\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5903225806451613,\n \"acc_stderr\": 0.027976054915347364,\n \"\
acc_norm\": 0.5903225806451613,\n \"acc_norm_stderr\": 0.027976054915347364\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3497536945812808,\n \"acc_stderr\": 0.03355400904969567,\n \"\
acc_norm\": 0.3497536945812808,\n \"acc_norm_stderr\": 0.03355400904969567\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6515151515151515,\n \"acc_stderr\": 0.033948539651564025,\n \"\
acc_norm\": 0.6515151515151515,\n \"acc_norm_stderr\": 0.033948539651564025\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7202072538860104,\n \"acc_stderr\": 0.03239637046735705,\n\
\ \"acc_norm\": 0.7202072538860104,\n \"acc_norm_stderr\": 0.03239637046735705\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4846153846153846,\n \"acc_stderr\": 0.025339003010106515,\n\
\ \"acc_norm\": 0.4846153846153846,\n \"acc_norm_stderr\": 0.025339003010106515\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.02773896963217609,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.02773896963217609\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.453781512605042,\n \"acc_stderr\": 0.03233943468182088,\n \
\ \"acc_norm\": 0.453781512605042,\n \"acc_norm_stderr\": 0.03233943468182088\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4105960264900662,\n \"acc_stderr\": 0.04016689594849928,\n \"\
acc_norm\": 0.4105960264900662,\n \"acc_norm_stderr\": 0.04016689594849928\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6990825688073394,\n \"acc_stderr\": 0.019664751366802114,\n \"\
acc_norm\": 0.6990825688073394,\n \"acc_norm_stderr\": 0.019664751366802114\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502325,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502325\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7058823529411765,\n \"acc_stderr\": 0.03198001660115072,\n \"\
acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.03198001660115072\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5874439461883408,\n\
\ \"acc_stderr\": 0.03304062175449297,\n \"acc_norm\": 0.5874439461883408,\n\
\ \"acc_norm_stderr\": 0.03304062175449297\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.0426073515764456,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.0426073515764456\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.04668408033024931,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.04668408033024931\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.588957055214724,\n \"acc_stderr\": 0.038656978537853624,\n\
\ \"acc_norm\": 0.588957055214724,\n \"acc_norm_stderr\": 0.038656978537853624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.045218299028335865,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.045218299028335865\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7564102564102564,\n\
\ \"acc_stderr\": 0.028120966503914397,\n \"acc_norm\": 0.7564102564102564,\n\
\ \"acc_norm_stderr\": 0.028120966503914397\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.70242656449553,\n\
\ \"acc_stderr\": 0.01634911191290942,\n \"acc_norm\": 0.70242656449553,\n\
\ \"acc_norm_stderr\": 0.01634911191290942\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5520231213872833,\n \"acc_stderr\": 0.026772990653361826,\n\
\ \"acc_norm\": 0.5520231213872833,\n \"acc_norm_stderr\": 0.026772990653361826\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4033519553072626,\n\
\ \"acc_stderr\": 0.01640712303219525,\n \"acc_norm\": 0.4033519553072626,\n\
\ \"acc_norm_stderr\": 0.01640712303219525\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5849673202614379,\n \"acc_stderr\": 0.028213504177824093,\n\
\ \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.028213504177824093\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.594855305466238,\n\
\ \"acc_stderr\": 0.027882383791325946,\n \"acc_norm\": 0.594855305466238,\n\
\ \"acc_norm_stderr\": 0.027882383791325946\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5401234567901234,\n \"acc_stderr\": 0.027731022753539277,\n\
\ \"acc_norm\": 0.5401234567901234,\n \"acc_norm_stderr\": 0.027731022753539277\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36879432624113473,\n \"acc_stderr\": 0.028782227561347243,\n \
\ \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.028782227561347243\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3820078226857888,\n\
\ \"acc_stderr\": 0.012409564470235565,\n \"acc_norm\": 0.3820078226857888,\n\
\ \"acc_norm_stderr\": 0.012409564470235565\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.030161911930767105,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.030161911930767105\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.46405228758169936,\n \"acc_stderr\": 0.020175488765484043,\n \
\ \"acc_norm\": 0.46405228758169936,\n \"acc_norm_stderr\": 0.020175488765484043\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.030862144921087565,\n\
\ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.030862144921087565\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7064676616915423,\n\
\ \"acc_stderr\": 0.03220024104534205,\n \"acc_norm\": 0.7064676616915423,\n\
\ \"acc_norm_stderr\": 0.03220024104534205\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n\
\ \"acc_stderr\": 0.038194861407583984,\n \"acc_norm\": 0.4036144578313253,\n\
\ \"acc_norm_stderr\": 0.038194861407583984\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.03424042924691584,\n\
\ \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.03424042924691584\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31946144430844553,\n\
\ \"mc1_stderr\": 0.016322644182960505,\n \"mc2\": 0.4683906945088401,\n\
\ \"mc2_stderr\": 0.015179636754528561\n }\n}\n```"
repo_url: https://huggingface.co/Azure99/blossom-v2-llama2-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-39-22.579303.parquet'
- config_name: results
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- results_2023-09-11T17-39-22.579303.parquet
- split: latest
path:
- results_2023-09-11T17-39-22.579303.parquet
---
# Dataset Card for Evaluation run of Azure99/blossom-v2-llama2-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Azure99/blossom-v2-llama2-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Azure99/blossom-v2-llama2-7b](https://huggingface.co/Azure99/blossom-v2-llama2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Azure99__blossom-v2-llama2-7b",
"harness_truthfulqa_mc_0",
split="train")
```
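The aggregated metrics shown in the next section are also stored in the `results` configuration declared above. As a minimal sketch (assuming only the `datasets` library, and relying on the `results` config and `latest` split alias defined in the YAML header), you could load them directly:
```python
from datasets import load_dataset

# The "latest" split of the "results" config always points to the
# aggregated-metrics parquet file of the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_Azure99__blossom-v2-llama2-7b",
    "results",
    split="latest",
)
```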
## Latest results
These are the [latest results from run 2023-09-11T17:39:22.579303](https://huggingface.co/datasets/open-llm-leaderboard/details_Azure99__blossom-v2-llama2-7b/blob/main/results_2023-09-11T17-39-22.579303.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5178148983618301,
"acc_stderr": 0.034987688011471604,
"acc_norm": 0.5215426213433675,
"acc_norm_stderr": 0.03497337234024997,
"mc1": 0.31946144430844553,
"mc1_stderr": 0.016322644182960505,
"mc2": 0.4683906945088401,
"mc2_stderr": 0.015179636754528561
},
"harness|arc:challenge|25": {
"acc": 0.5085324232081911,
"acc_stderr": 0.014609263165632186,
"acc_norm": 0.5409556313993175,
"acc_norm_stderr": 0.01456229107360123
},
"harness|hellaswag|10": {
"acc": 0.5981876120294762,
"acc_stderr": 0.0048926244909372205,
"acc_norm": 0.785700059749054,
"acc_norm_stderr": 0.0040949719808920804
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480863,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480863
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5547169811320755,
"acc_stderr": 0.030588052974270655,
"acc_norm": 0.5547169811320755,
"acc_norm_stderr": 0.030588052974270655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.04179596617581,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.04179596617581
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.47398843930635837,
"acc_stderr": 0.038073017265045125,
"acc_norm": 0.47398843930635837,
"acc_norm_stderr": 0.038073017265045125
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.032232762667117124,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.032232762667117124
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30423280423280424,
"acc_stderr": 0.02369541500946308,
"acc_norm": 0.30423280423280424,
"acc_norm_stderr": 0.02369541500946308
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5903225806451613,
"acc_stderr": 0.027976054915347364,
"acc_norm": 0.5903225806451613,
"acc_norm_stderr": 0.027976054915347364
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3497536945812808,
"acc_stderr": 0.03355400904969567,
"acc_norm": 0.3497536945812808,
"acc_norm_stderr": 0.03355400904969567
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6515151515151515,
"acc_stderr": 0.033948539651564025,
"acc_norm": 0.6515151515151515,
"acc_norm_stderr": 0.033948539651564025
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7202072538860104,
"acc_stderr": 0.03239637046735705,
"acc_norm": 0.7202072538860104,
"acc_norm_stderr": 0.03239637046735705
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4846153846153846,
"acc_stderr": 0.025339003010106515,
"acc_norm": 0.4846153846153846,
"acc_norm_stderr": 0.025339003010106515
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.02773896963217609,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.02773896963217609
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.453781512605042,
"acc_stderr": 0.03233943468182088,
"acc_norm": 0.453781512605042,
"acc_norm_stderr": 0.03233943468182088
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4105960264900662,
"acc_stderr": 0.04016689594849928,
"acc_norm": 0.4105960264900662,
"acc_norm_stderr": 0.04016689594849928
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6990825688073394,
"acc_stderr": 0.019664751366802114,
"acc_norm": 0.6990825688073394,
"acc_norm_stderr": 0.019664751366802114
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.03198001660115072,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.03198001660115072
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5874439461883408,
"acc_stderr": 0.03304062175449297,
"acc_norm": 0.5874439461883408,
"acc_norm_stderr": 0.03304062175449297
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.0426073515764456,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.0426073515764456
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04668408033024931,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04668408033024931
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.588957055214724,
"acc_stderr": 0.038656978537853624,
"acc_norm": 0.588957055214724,
"acc_norm_stderr": 0.038656978537853624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.045218299028335865,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.045218299028335865
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7564102564102564,
"acc_stderr": 0.028120966503914397,
"acc_norm": 0.7564102564102564,
"acc_norm_stderr": 0.028120966503914397
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.70242656449553,
"acc_stderr": 0.01634911191290942,
"acc_norm": 0.70242656449553,
"acc_norm_stderr": 0.01634911191290942
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5520231213872833,
"acc_stderr": 0.026772990653361826,
"acc_norm": 0.5520231213872833,
"acc_norm_stderr": 0.026772990653361826
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4033519553072626,
"acc_stderr": 0.01640712303219525,
"acc_norm": 0.4033519553072626,
"acc_norm_stderr": 0.01640712303219525
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5849673202614379,
"acc_stderr": 0.028213504177824093,
"acc_norm": 0.5849673202614379,
"acc_norm_stderr": 0.028213504177824093
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.594855305466238,
"acc_stderr": 0.027882383791325946,
"acc_norm": 0.594855305466238,
"acc_norm_stderr": 0.027882383791325946
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5401234567901234,
"acc_stderr": 0.027731022753539277,
"acc_norm": 0.5401234567901234,
"acc_norm_stderr": 0.027731022753539277
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36879432624113473,
"acc_stderr": 0.028782227561347243,
"acc_norm": 0.36879432624113473,
"acc_norm_stderr": 0.028782227561347243
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3820078226857888,
"acc_stderr": 0.012409564470235565,
"acc_norm": 0.3820078226857888,
"acc_norm_stderr": 0.012409564470235565
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.030161911930767105,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.030161911930767105
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.46405228758169936,
"acc_stderr": 0.020175488765484043,
"acc_norm": 0.46405228758169936,
"acc_norm_stderr": 0.020175488765484043
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.030862144921087565,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.030862144921087565
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7064676616915423,
"acc_stderr": 0.03220024104534205,
"acc_norm": 0.7064676616915423,
"acc_norm_stderr": 0.03220024104534205
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.038194861407583984,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.038194861407583984
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7251461988304093,
"acc_stderr": 0.03424042924691584,
"acc_norm": 0.7251461988304093,
"acc_norm_stderr": 0.03424042924691584
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31946144430844553,
"mc1_stderr": 0.016322644182960505,
"mc2": 0.4683906945088401,
"mc2_stderr": 0.015179636754528561
}
}
```
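Each `harness|hendrycksTest-*` entry above is an independent MMLU subtask. As a short illustrative sketch (assuming the dictionary above is bound to a Python variable named `results`, a hypothetical name), the subtask accuracies could be averaged like this:
```python
# Keep only the per-subtask MMLU entries (keys like
# "harness|hendrycksTest-anatomy|5") and average their accuracies.
mmlu = {name: scores for name, scores in results.items()
        if name.startswith("harness|hendrycksTest-")}
mean_acc = sum(scores["acc"] for scores in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subtasks, mean acc = {mean_acc:.4f}")
```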
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16 | 2023-09-11T17:49:33.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of dhmeltzer/Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dhmeltzer/Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16](https://huggingface.co/dhmeltzer/Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-11T17:48:14.644615](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16/blob/main/results_2023-09-11T17-48-14.644615.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.559538865205019,\n\
\ \"acc_stderr\": 0.034238456372930964,\n \"acc_norm\": 0.5639020577001664,\n\
\ \"acc_norm_stderr\": 0.03421645137425091,\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.43605850067412455,\n\
\ \"mc2_stderr\": 0.014074574930930854\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.552901023890785,\n \"acc_stderr\": 0.014529380160526843,\n\
\ \"acc_norm\": 0.6040955631399317,\n \"acc_norm_stderr\": 0.014291228393536588\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6195976897032464,\n\
\ \"acc_stderr\": 0.004844935327599204,\n \"acc_norm\": 0.8258315076677952,\n\
\ \"acc_norm_stderr\": 0.0037847921724660626\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411023,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411023\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.040601270352363966,\n\
\ \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.040601270352363966\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6188679245283019,\n \"acc_stderr\": 0.029890609686286637,\n\
\ \"acc_norm\": 0.6188679245283019,\n \"acc_norm_stderr\": 0.029890609686286637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5763888888888888,\n\
\ \"acc_stderr\": 0.04132125019723369,\n \"acc_norm\": 0.5763888888888888,\n\
\ \"acc_norm_stderr\": 0.04132125019723369\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.5202312138728323,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.032321469162244675,\n\
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.032321469162244675\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.04372748290278007,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.04372748290278007\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3386243386243386,\n \"acc_stderr\": 0.02437319786798306,\n \"\
acc_norm\": 0.3386243386243386,\n \"acc_norm_stderr\": 0.02437319786798306\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6580645161290323,\n\
\ \"acc_stderr\": 0.02698528957655274,\n \"acc_norm\": 0.6580645161290323,\n\
\ \"acc_norm_stderr\": 0.02698528957655274\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0368105086916155,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0368105086916155\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300992,\n \"\
acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300992\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.02649905770139744,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.02649905770139744\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5,\n \"acc_stderr\": 0.02535100632816969,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.02535100632816969\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145665,\n\
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145665\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.032145368597886394,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.032145368597886394\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.763302752293578,\n \"acc_stderr\": 0.018224078117299074,\n \"\
acc_norm\": 0.763302752293578,\n \"acc_norm_stderr\": 0.018224078117299074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"\
acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.02955429260569506,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.02955429260569506\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.03210062154134986,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.03210062154134986\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935575,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935575\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n\
\ \"acc_stderr\": 0.026655699653922726,\n \"acc_norm\": 0.7905982905982906,\n\
\ \"acc_norm_stderr\": 0.026655699653922726\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7547892720306514,\n\
\ \"acc_stderr\": 0.01538435228454394,\n \"acc_norm\": 0.7547892720306514,\n\
\ \"acc_norm_stderr\": 0.01538435228454394\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.025722802200895806,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.025722802200895806\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3877094972067039,\n\
\ \"acc_stderr\": 0.01629533232815581,\n \"acc_norm\": 0.3877094972067039,\n\
\ \"acc_norm_stderr\": 0.01629533232815581\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.02758281141515961,\n\
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.02758281141515961\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n\
\ \"acc_stderr\": 0.027155208103200868,\n \"acc_norm\": 0.6463022508038585,\n\
\ \"acc_norm_stderr\": 0.027155208103200868\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6574074074074074,\n \"acc_stderr\": 0.026406145973625672,\n\
\ \"acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.026406145973625672\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3971631205673759,\n \"acc_stderr\": 0.029189805673587095,\n \
\ \"acc_norm\": 0.3971631205673759,\n \"acc_norm_stderr\": 0.029189805673587095\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41916558018252936,\n\
\ \"acc_stderr\": 0.012602244505788233,\n \"acc_norm\": 0.41916558018252936,\n\
\ \"acc_norm_stderr\": 0.012602244505788233\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.03035969707904612,\n\
\ \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03035969707904612\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5490196078431373,\n \"acc_stderr\": 0.020130388312904528,\n \
\ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.020130388312904528\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.030932858792789848,\n\
\ \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.030932858792789848\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.031157150869355558,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.031157150869355558\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.43605850067412455,\n\
\ \"mc2_stderr\": 0.014074574930930854\n }\n}\n```"
repo_url: https://huggingface.co/dhmeltzer/Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-48-14.644615.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-48-14.644615.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-48-14.644615.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-48-14.644615.parquet'
- config_name: results
data_files:
- split: 2023_09_11T17_48_14.644615
path:
- results_2023-09-11T17-48-14.644615.parquet
- split: latest
path:
- results_2023-09-11T17-48-14.644615.parquet
---
# Dataset Card for Evaluation run of dhmeltzer/Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/dhmeltzer/Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [dhmeltzer/Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16](https://huggingface.co/dhmeltzer/Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16",
"harness_truthfulqa_mc_0",
split="train")
```
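The same call pattern works for any of the configurations listed in the YAML header; in particular, the aggregated metrics live in the `results` configuration, whose `latest` split always tracks the most recent run. A minimal sketch, assuming the config and split names shown above:
```python
from datasets import load_dataset

# Load the aggregated metrics; the "latest" split points at the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16",
    "results",
    split="latest",
)
```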
## Latest results
These are the [latest results from run 2023-09-11T17:48:14.644615](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16/blob/main/results_2023-09-11T17-48-14.644615.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.559538865205019,
"acc_stderr": 0.034238456372930964,
"acc_norm": 0.5639020577001664,
"acc_norm_stderr": 0.03421645137425091,
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184828,
"mc2": 0.43605850067412455,
"mc2_stderr": 0.014074574930930854
},
"harness|arc:challenge|25": {
"acc": 0.552901023890785,
"acc_stderr": 0.014529380160526843,
"acc_norm": 0.6040955631399317,
"acc_norm_stderr": 0.014291228393536588
},
"harness|hellaswag|10": {
"acc": 0.6195976897032464,
"acc_stderr": 0.004844935327599204,
"acc_norm": 0.8258315076677952,
"acc_norm_stderr": 0.0037847921724660626
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411023,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411023
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5328947368421053,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.5328947368421053,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6188679245283019,
"acc_stderr": 0.029890609686286637,
"acc_norm": 0.6188679245283019,
"acc_norm_stderr": 0.029890609686286637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5763888888888888,
"acc_stderr": 0.04132125019723369,
"acc_norm": 0.5763888888888888,
"acc_norm_stderr": 0.04132125019723369
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.032321469162244675,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.032321469162244675
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278007,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278007
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3386243386243386,
"acc_stderr": 0.02437319786798306,
"acc_norm": 0.3386243386243386,
"acc_norm_stderr": 0.02437319786798306
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6580645161290323,
"acc_stderr": 0.02698528957655274,
"acc_norm": 0.6580645161290323,
"acc_norm_stderr": 0.02698528957655274
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0368105086916155,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0368105086916155
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7121212121212122,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.7121212121212122,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.02649905770139744,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.02649905770139744
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5,
"acc_stderr": 0.02535100632816969,
"acc_norm": 0.5,
"acc_norm_stderr": 0.02535100632816969
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145665,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145665
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.032145368597886394,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.032145368597886394
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.763302752293578,
"acc_stderr": 0.018224078117299074,
"acc_norm": 0.763302752293578,
"acc_norm_stderr": 0.018224078117299074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.02955429260569506,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.02955429260569506
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.03210062154134986,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.03210062154134986
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935575,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935575
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340456,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340456
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7905982905982906,
"acc_stderr": 0.026655699653922726,
"acc_norm": 0.7905982905982906,
"acc_norm_stderr": 0.026655699653922726
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7547892720306514,
"acc_stderr": 0.01538435228454394,
"acc_norm": 0.7547892720306514,
"acc_norm_stderr": 0.01538435228454394
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.025722802200895806,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.025722802200895806
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3877094972067039,
"acc_stderr": 0.01629533232815581,
"acc_norm": 0.3877094972067039,
"acc_norm_stderr": 0.01629533232815581
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.02758281141515961,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.02758281141515961
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.027155208103200868,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.027155208103200868
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.026406145973625672,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.026406145973625672
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3971631205673759,
"acc_stderr": 0.029189805673587095,
"acc_norm": 0.3971631205673759,
"acc_norm_stderr": 0.029189805673587095
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41916558018252936,
"acc_stderr": 0.012602244505788233,
"acc_norm": 0.41916558018252936,
"acc_norm_stderr": 0.012602244505788233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.03035969707904612,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.03035969707904612
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.020130388312904528,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.020130388312904528
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670239,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670239
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6285714285714286,
"acc_stderr": 0.030932858792789848,
"acc_norm": 0.6285714285714286,
"acc_norm_stderr": 0.030932858792789848
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.031157150869355558,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.031157150869355558
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184828,
"mc2": 0.43605850067412455,
"mc2_stderr": 0.014074574930930854
}
}
```
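To work with these numbers programmatically instead of reading the JSON above, one option is to download the raw results file (the one linked in the "Latest results" section) with `huggingface_hub` and average the per-subtask scores. This is a hedged sketch: the exact top-level layout of the file is an assumption, so the code falls back to treating the whole file as the task-to-metrics mapping printed above.
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results JSON for this run (the file linked above).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_eli5_1024_r_64_alpha_16",
    filename="results_2023-09-11T17-48-14.644615.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# Assumption: the task -> metrics mapping printed above is either the whole file
# or stored under a "results" key.
metrics = data.get("results", data)
mmlu = [v["acc_norm"] for k, v in metrics.items() if k.startswith("harness|hendrycksTest")]
print(f"Mean acc_norm over {len(mmlu)} MMLU subtasks: {sum(mmlu) / len(mmlu):.4f}")
```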
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
karmanov5/nikakern | 2023-09-11T17:50:32.000Z | [
"region:us"
] | karmanov5 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16 | 2023-09-11T17:52:15.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of dhmeltzer/Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dhmeltzer/Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16](https://huggingface.co/dhmeltzer/Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-11T17:50:57.787560](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16/blob/main/results_2023-09-11T17-50-57.787560.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5545920148625109,\n\
\ \"acc_stderr\": 0.03438427108776292,\n \"acc_norm\": 0.5588094004719787,\n\
\ \"acc_norm_stderr\": 0.03436363216969358,\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062133,\n \"mc2\": 0.35747843581335975,\n\
\ \"mc2_stderr\": 0.013497434097399959\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5503412969283277,\n \"acc_stderr\": 0.014537144444284738,\n\
\ \"acc_norm\": 0.590443686006826,\n \"acc_norm_stderr\": 0.014370358632472432\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.614618601872137,\n\
\ \"acc_stderr\": 0.004856906473719382,\n \"acc_norm\": 0.8233419637522406,\n\
\ \"acc_norm_stderr\": 0.0038059961194403767\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249034,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249034\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6188679245283019,\n \"acc_stderr\": 0.029890609686286637,\n\
\ \"acc_norm\": 0.6188679245283019,\n \"acc_norm_stderr\": 0.029890609686286637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.04122728707651282,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.04122728707651282\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.49710982658959535,\n\
\ \"acc_stderr\": 0.038124005659748335,\n \"acc_norm\": 0.49710982658959535,\n\
\ \"acc_norm_stderr\": 0.038124005659748335\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.032400380867927465,\n\
\ \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.032400380867927465\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.024278568024307702,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.024278568024307702\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6838709677419355,\n\
\ \"acc_stderr\": 0.026450874489042774,\n \"acc_norm\": 0.6838709677419355,\n\
\ \"acc_norm_stderr\": 0.026450874489042774\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.035025446508458714,\n\
\ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.035025446508458714\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031595,\n\
\ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031595\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6767676767676768,\n \"acc_stderr\": 0.033322999210706444,\n \"\
acc_norm\": 0.6767676767676768,\n \"acc_norm_stderr\": 0.033322999210706444\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5358974358974359,\n \"acc_stderr\": 0.025285585990017848,\n\
\ \"acc_norm\": 0.5358974358974359,\n \"acc_norm_stderr\": 0.025285585990017848\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5840336134453782,\n \"acc_stderr\": 0.032016501007396114,\n\
\ \"acc_norm\": 0.5840336134453782,\n \"acc_norm_stderr\": 0.032016501007396114\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.763302752293578,\n \"acc_stderr\": 0.01822407811729908,\n \"acc_norm\"\
: 0.763302752293578,\n \"acc_norm_stderr\": 0.01822407811729908\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4398148148148148,\n\
\ \"acc_stderr\": 0.03385177976044812,\n \"acc_norm\": 0.4398148148148148,\n\
\ \"acc_norm_stderr\": 0.03385177976044812\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145638,\n\
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145638\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7172995780590717,\n \"acc_stderr\": 0.02931281415395593,\n \
\ \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.02931281415395593\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.03219079200419995,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.03219079200419995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.042258754519696365,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.042258754519696365\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6859504132231405,\n \"acc_stderr\": 0.04236964753041018,\n \"\
acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.04236964753041018\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.04373313040914761,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.04373313040914761\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.037149084099355745,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.037149084099355745\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.041577515398656284,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.041577515398656284\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7535121328224776,\n\
\ \"acc_stderr\": 0.015411308769686936,\n \"acc_norm\": 0.7535121328224776,\n\
\ \"acc_norm_stderr\": 0.015411308769686936\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.025816756791584183,\n\
\ \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.025816756791584183\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29497206703910617,\n\
\ \"acc_stderr\": 0.015251931579208173,\n \"acc_norm\": 0.29497206703910617,\n\
\ \"acc_norm_stderr\": 0.015251931579208173\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.027870745278290282,\n\
\ \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.027870745278290282\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n\
\ \"acc_stderr\": 0.027155208103200868,\n \"acc_norm\": 0.6463022508038585,\n\
\ \"acc_norm_stderr\": 0.027155208103200868\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6604938271604939,\n \"acc_stderr\": 0.026348564412011628,\n\
\ \"acc_norm\": 0.6604938271604939,\n \"acc_norm_stderr\": 0.026348564412011628\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40070921985815605,\n \"acc_stderr\": 0.029233465745573086,\n \
\ \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.029233465745573086\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4172099087353325,\n\
\ \"acc_stderr\": 0.012593959992906422,\n \"acc_norm\": 0.4172099087353325,\n\
\ \"acc_norm_stderr\": 0.012593959992906422\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.030134614954403924,\n \
\ \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.030134614954403924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5375816993464052,\n \"acc_stderr\": 0.02017061497496976,\n \
\ \"acc_norm\": 0.5375816993464052,\n \"acc_norm_stderr\": 0.02017061497496976\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6122448979591837,\n \"acc_stderr\": 0.031192230726795656,\n\
\ \"acc_norm\": 0.6122448979591837,\n \"acc_norm_stderr\": 0.031192230726795656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.03096590312357302,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.03096590312357302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062133,\n \"mc2\": 0.35747843581335975,\n\
\ \"mc2_stderr\": 0.013497434097399959\n }\n}\n```"
repo_url: https://huggingface.co/dhmeltzer/Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-50-57.787560.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-50-57.787560.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-50-57.787560.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-50-57.787560.parquet'
- config_name: results
data_files:
- split: 2023_09_11T17_50_57.787560
path:
- results_2023-09-11T17-50-57.787560.parquet
- split: latest
path:
- results_2023-09-11T17-50-57.787560.parquet
---
# Dataset Card for Evaluation run of dhmeltzer/Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/dhmeltzer/Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [dhmeltzer/Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16](https://huggingface.co/dhmeltzer/Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16",
"harness_truthfulqa_mc_0",
split="train")
```
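Each run is also exposed under its own timestamped split, and a `latest` split always mirrors the most recent run. As a minimal variant of the snippet above (assuming the same `datasets` API), you can request it directly:
```python
from datasets import load_dataset

# "latest" always resolves to the most recent evaluation run
# for the requested configuration.
data = load_dataset(
    "open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16",
    "harness_truthfulqa_mc_0",
    split="latest",
)
```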
## Latest results
These are the [latest results from run 2023-09-11T17:50:57.787560](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16/blob/main/results_2023-09-11T17-50-57.787560.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5545920148625109,
"acc_stderr": 0.03438427108776292,
"acc_norm": 0.5588094004719787,
"acc_norm_stderr": 0.03436363216969358,
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062133,
"mc2": 0.35747843581335975,
"mc2_stderr": 0.013497434097399959
},
"harness|arc:challenge|25": {
"acc": 0.5503412969283277,
"acc_stderr": 0.014537144444284738,
"acc_norm": 0.590443686006826,
"acc_norm_stderr": 0.014370358632472432
},
"harness|hellaswag|10": {
"acc": 0.614618601872137,
"acc_stderr": 0.004856906473719382,
"acc_norm": 0.8233419637522406,
"acc_norm_stderr": 0.0038059961194403767
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6188679245283019,
"acc_stderr": 0.029890609686286637,
"acc_norm": 0.6188679245283019,
"acc_norm_stderr": 0.029890609686286637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04122728707651282,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04122728707651282
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.49710982658959535,
"acc_stderr": 0.038124005659748335,
"acc_norm": 0.49710982658959535,
"acc_norm_stderr": 0.038124005659748335
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.024278568024307702,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.024278568024307702
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6838709677419355,
"acc_stderr": 0.026450874489042774,
"acc_norm": 0.6838709677419355,
"acc_norm_stderr": 0.026450874489042774
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.035025446508458714,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.035025446508458714
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.03697442205031595,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.03697442205031595
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6767676767676768,
"acc_stderr": 0.033322999210706444,
"acc_norm": 0.6767676767676768,
"acc_norm_stderr": 0.033322999210706444
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5358974358974359,
"acc_stderr": 0.025285585990017848,
"acc_norm": 0.5358974358974359,
"acc_norm_stderr": 0.025285585990017848
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5840336134453782,
"acc_stderr": 0.032016501007396114,
"acc_norm": 0.5840336134453782,
"acc_norm_stderr": 0.032016501007396114
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.763302752293578,
"acc_stderr": 0.01822407811729908,
"acc_norm": 0.763302752293578,
"acc_norm_stderr": 0.01822407811729908
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044812,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145638,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145638
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.02931281415395593,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.02931281415395593
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.03219079200419995,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.03219079200419995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.042258754519696365,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.042258754519696365
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.04236964753041018,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.04236964753041018
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.04373313040914761,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.04373313040914761
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.037149084099355745,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.037149084099355745
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.041577515398656284,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.041577515398656284
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7535121328224776,
"acc_stderr": 0.015411308769686936,
"acc_norm": 0.7535121328224776,
"acc_norm_stderr": 0.015411308769686936
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.025816756791584183,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.025816756791584183
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29497206703910617,
"acc_stderr": 0.015251931579208173,
"acc_norm": 0.29497206703910617,
"acc_norm_stderr": 0.015251931579208173
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.027870745278290282,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.027870745278290282
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.027155208103200868,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.027155208103200868
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6604938271604939,
"acc_stderr": 0.026348564412011628,
"acc_norm": 0.6604938271604939,
"acc_norm_stderr": 0.026348564412011628
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.029233465745573086,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.029233465745573086
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4172099087353325,
"acc_stderr": 0.012593959992906422,
"acc_norm": 0.4172099087353325,
"acc_norm_stderr": 0.012593959992906422
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5625,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5375816993464052,
"acc_stderr": 0.02017061497496976,
"acc_norm": 0.5375816993464052,
"acc_norm_stderr": 0.02017061497496976
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6122448979591837,
"acc_stderr": 0.031192230726795656,
"acc_norm": 0.6122448979591837,
"acc_norm_stderr": 0.031192230726795656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.03096590312357302,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.03096590312357302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062133,
"mc2": 0.35747843581335975,
"mc2_stderr": 0.013497434097399959
}
}
```
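The aggregated metrics above are also stored in the `results` configuration listed in this card's metadata; a short sketch (again assuming the standard `datasets` API) for loading them programmatically:
```python
from datasets import load_dataset

# The "results" config holds the aggregated metrics per run;
# the "latest" split points at the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16",
    "results",
    split="latest",
)
```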
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_StudentLLM__Alpagasus-2-13b-QLoRA-merged | 2023-09-21T21:37:22.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of StudentLLM/Alpagasus-2-13b-QLoRA-merged
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [StudentLLM/Alpagasus-2-13b-QLoRA-merged](https://huggingface.co/StudentLLM/Alpagasus-2-13b-QLoRA-merged)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_StudentLLM__Alpagasus-2-13b-QLoRA-merged\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-21T21:35:59.433556](https://huggingface.co/datasets/open-llm-leaderboard/details_StudentLLM__Alpagasus-2-13b-QLoRA-merged/blob/main/results_2023-09-21T21-35-59.433556.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.556661693391276,\n\
\ \"acc_stderr\": 0.03431116511747462,\n \"acc_norm\": 0.5609236537466653,\n\
\ \"acc_norm_stderr\": 0.034289447584422525,\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.015392118805015023,\n \"mc2\": 0.38652119986050765,\n\
\ \"mc2_stderr\": 0.014265445465776364\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5631399317406144,\n \"acc_stderr\": 0.014494421584256517,\n\
\ \"acc_norm\": 0.6083617747440273,\n \"acc_norm_stderr\": 0.014264122124938215\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6181039633539136,\n\
\ \"acc_stderr\": 0.0048485832436066835,\n \"acc_norm\": 0.8243377813184625,\n\
\ \"acc_norm_stderr\": 0.003797548252851621\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526066,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526066\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249033,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249033\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n\
\ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5763888888888888,\n\
\ \"acc_stderr\": 0.0413212501972337,\n \"acc_norm\": 0.5763888888888888,\n\
\ \"acc_norm_stderr\": 0.0413212501972337\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3386243386243386,\n \"acc_stderr\": 0.02437319786798306,\n \"\
acc_norm\": 0.3386243386243386,\n \"acc_norm_stderr\": 0.02437319786798306\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6612903225806451,\n \"acc_stderr\": 0.02692344605930284,\n \"\
acc_norm\": 0.6612903225806451,\n \"acc_norm_stderr\": 0.02692344605930284\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\"\
: 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6767676767676768,\n \"acc_stderr\": 0.033322999210706444,\n \"\
acc_norm\": 0.6767676767676768,\n \"acc_norm_stderr\": 0.033322999210706444\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.028697873971860688,\n\
\ \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.028697873971860688\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5205128205128206,\n \"acc_stderr\": 0.02532966316348994,\n \
\ \"acc_norm\": 0.5205128205128206,\n \"acc_norm_stderr\": 0.02532966316348994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5840336134453782,\n \"acc_stderr\": 0.032016501007396114,\n\
\ \"acc_norm\": 0.5840336134453782,\n \"acc_norm_stderr\": 0.032016501007396114\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7669724770642202,\n \"acc_stderr\": 0.01812566918086151,\n \"\
acc_norm\": 0.7669724770642202,\n \"acc_norm_stderr\": 0.01812566918086151\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604246,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.03227790442850499,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.03227790442850499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.042607351576445594,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.042607351576445594\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302873,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302873\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.026246772946890477,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.026246772946890477\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7586206896551724,\n\
\ \"acc_stderr\": 0.015302380123542108,\n \"acc_norm\": 0.7586206896551724,\n\
\ \"acc_norm_stderr\": 0.015302380123542108\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.025770292082977257,\n\
\ \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.025770292082977257\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3541899441340782,\n\
\ \"acc_stderr\": 0.015995644947299232,\n \"acc_norm\": 0.3541899441340782,\n\
\ \"acc_norm_stderr\": 0.015995644947299232\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.027582811415159603,\n\
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.027582811415159603\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n\
\ \"acc_stderr\": 0.027155208103200865,\n \"acc_norm\": 0.6463022508038585,\n\
\ \"acc_norm_stderr\": 0.027155208103200865\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6512345679012346,\n \"acc_stderr\": 0.026517597724465013,\n\
\ \"acc_norm\": 0.6512345679012346,\n \"acc_norm_stderr\": 0.026517597724465013\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3900709219858156,\n \"acc_stderr\": 0.029097675599463926,\n \
\ \"acc_norm\": 0.3900709219858156,\n \"acc_norm_stderr\": 0.029097675599463926\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42698826597131684,\n\
\ \"acc_stderr\": 0.012633353557534425,\n \"acc_norm\": 0.42698826597131684,\n\
\ \"acc_norm_stderr\": 0.012633353557534425\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5551470588235294,\n \"acc_stderr\": 0.030187532060329387,\n\
\ \"acc_norm\": 0.5551470588235294,\n \"acc_norm_stderr\": 0.030187532060329387\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5555555555555556,\n \"acc_stderr\": 0.020102583895887184,\n \
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.020102583895887184\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n\
\ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n\
\ \"acc_stderr\": 0.030769444967296018,\n \"acc_norm\": 0.746268656716418,\n\
\ \"acc_norm_stderr\": 0.030769444967296018\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117825,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117825\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.015392118805015023,\n \"mc2\": 0.38652119986050765,\n\
\ \"mc2_stderr\": 0.014265445465776364\n }\n}\n```"
repo_url: https://huggingface.co/StudentLLM/Alpagasus-2-13b-QLoRA-merged
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|arc:challenge|25_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|arc:challenge|25_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hellaswag|10_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hellaswag|10_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T21-35-59.433556.parquet'
- config_name: results
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- results_2023-09-11T18-18-21.353761.parquet
- split: 2023_09_21T21_35_59.433556
path:
- results_2023-09-21T21-35-59.433556.parquet
- split: latest
path:
- results_2023-09-21T21-35-59.433556.parquet
---
# Dataset Card for Evaluation run of StudentLLM/Alpagasus-2-13b-QLoRA-merged
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/StudentLLM/Alpagasus-2-13b-QLoRA-merged
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [StudentLLM/Alpagasus-2-13b-QLoRA-merged](https://huggingface.co/StudentLLM/Alpagasus-2-13b-QLoRA-merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_StudentLLM__Alpagasus-2-13b-QLoRA-merged",
"harness_truthfulqa_mc_0",
split="train")
```
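You can also ask for the "latest" split explicitly, or pin a specific run by its timestamped split name. A minimal sketch, using only the split names declared in this card's configuration above:
```python
from datasets import load_dataset

# "latest" always points at the most recent run.
latest = load_dataset(
    "open-llm-leaderboard/details_StudentLLM__Alpagasus-2-13b-QLoRA-merged",
    "harness_truthfulqa_mc_0",
    split="latest",
)

# A timestamped split name (taken from the data_files listing above)
# pins the details of one specific evaluation run.
pinned = load_dataset(
    "open-llm-leaderboard/details_StudentLLM__Alpagasus-2-13b-QLoRA-merged",
    "harness_truthfulqa_mc_0",
    split="2023_09_21T21_35_59.433556",
)
```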
## Latest results
These are the [latest results from run 2023-09-21T21:35:59.433556](https://huggingface.co/datasets/open-llm-leaderboard/details_StudentLLM__Alpagasus-2-13b-QLoRA-merged/blob/main/results_2023-09-21T21-35-59.433556.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.556661693391276,
"acc_stderr": 0.03431116511747462,
"acc_norm": 0.5609236537466653,
"acc_norm_stderr": 0.034289447584422525,
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015023,
"mc2": 0.38652119986050765,
"mc2_stderr": 0.014265445465776364
},
"harness|arc:challenge|25": {
"acc": 0.5631399317406144,
"acc_stderr": 0.014494421584256517,
"acc_norm": 0.6083617747440273,
"acc_norm_stderr": 0.014264122124938215
},
"harness|hellaswag|10": {
"acc": 0.6181039633539136,
"acc_stderr": 0.0048485832436066835,
"acc_norm": 0.8243377813184625,
"acc_norm_stderr": 0.003797548252851621
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249033,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249033
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5763888888888888,
"acc_stderr": 0.0413212501972337,
"acc_norm": 0.5763888888888888,
"acc_norm_stderr": 0.0413212501972337
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3386243386243386,
"acc_stderr": 0.02437319786798306,
"acc_norm": 0.3386243386243386,
"acc_norm_stderr": 0.02437319786798306
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6612903225806451,
"acc_stderr": 0.02692344605930284,
"acc_norm": 0.6612903225806451,
"acc_norm_stderr": 0.02692344605930284
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6767676767676768,
"acc_stderr": 0.033322999210706444,
"acc_norm": 0.6767676767676768,
"acc_norm_stderr": 0.033322999210706444
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8031088082901554,
"acc_stderr": 0.028697873971860688,
"acc_norm": 0.8031088082901554,
"acc_norm_stderr": 0.028697873971860688
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5205128205128206,
"acc_stderr": 0.02532966316348994,
"acc_norm": 0.5205128205128206,
"acc_norm_stderr": 0.02532966316348994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5840336134453782,
"acc_stderr": 0.032016501007396114,
"acc_norm": 0.5840336134453782,
"acc_norm_stderr": 0.032016501007396114
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7669724770642202,
"acc_stderr": 0.01812566918086151,
"acc_norm": 0.7669724770642202,
"acc_norm_stderr": 0.01812566918086151
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.03227790442850499,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.03227790442850499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.042607351576445594,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.042607351576445594
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302873,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302873
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.037311335196738925,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.037311335196738925
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467762,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467762
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.026246772946890477,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.026246772946890477
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7586206896551724,
"acc_stderr": 0.015302380123542108,
"acc_norm": 0.7586206896551724,
"acc_norm_stderr": 0.015302380123542108
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6445086705202312,
"acc_stderr": 0.025770292082977257,
"acc_norm": 0.6445086705202312,
"acc_norm_stderr": 0.025770292082977257
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3541899441340782,
"acc_stderr": 0.015995644947299232,
"acc_norm": 0.3541899441340782,
"acc_norm_stderr": 0.015995644947299232
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.027582811415159603,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.027582811415159603
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.027155208103200865,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.027155208103200865
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6512345679012346,
"acc_stderr": 0.026517597724465013,
"acc_norm": 0.6512345679012346,
"acc_norm_stderr": 0.026517597724465013
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3900709219858156,
"acc_stderr": 0.029097675599463926,
"acc_norm": 0.3900709219858156,
"acc_norm_stderr": 0.029097675599463926
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42698826597131684,
"acc_stderr": 0.012633353557534425,
"acc_norm": 0.42698826597131684,
"acc_norm_stderr": 0.012633353557534425
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5551470588235294,
"acc_stderr": 0.030187532060329387,
"acc_norm": 0.5551470588235294,
"acc_norm_stderr": 0.030187532060329387
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.020102583895887184,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.020102583895887184
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.030769444967296018,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.030769444967296018
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117825,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117825
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015023,
"mc2": 0.38652119986050765,
"mc2_stderr": 0.014265445465776364
}
}
```
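To get the aggregated numbers above programmatically rather than by copying them from this JSON dump, you can load the "results" configuration. A minimal sketch (the exact column layout of the results parquet is not documented in this card, so inspect the schema first):
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split points at the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_StudentLLM__Alpagasus-2-13b-QLoRA-merged",
    "results",
    split="latest",
)
print(results)           # number of rows and column names
print(results.features)  # full schema
```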
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
SeyedAli/Persian-Text-Paraphrasing | 2023-09-11T18:32:14.000Z | [
"license:mit",
"region:us"
] | SeyedAli | null | null | null | 1 | 0 | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_text
dtype: string
- name: target_text
dtype: string
splits:
- name: train
num_bytes: 126104
num_examples: 800
- name: test
num_bytes: 31702
num_examples: 200
download_size: 92723
dataset_size: 157806
---
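A minimal usage sketch, assuming only what the `dataset_info` above declares (two string columns, `input_text` and `target_text`, in train/test splits):
```python
from datasets import load_dataset

# Loads the train (800 rows) and test (200 rows) splits declared above.
ds = load_dataset("SeyedAli/Persian-Text-Paraphrasing")
print(ds)              # DatasetDict with "train" and "test"
print(ds["train"][0])  # {'input_text': ..., 'target_text': ...}
```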
|
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE1_17w-r16 | 2023-09-11T18:34:54.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE1_17w-r16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE1_17w-r16](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE1_17w-r16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE1_17w-r16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-11T18:33:35.889629](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE1_17w-r16/blob/main/results_2023-09-11T18-33-35.889629.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.561943014174084,\n\
\ \"acc_stderr\": 0.03429623620221756,\n \"acc_norm\": 0.5661656936826862,\n\
\ \"acc_norm_stderr\": 0.03427639021174252,\n \"mc1\": 0.26438188494492043,\n\
\ \"mc1_stderr\": 0.015438211119522515,\n \"mc2\": 0.39750918100971155,\n\
\ \"mc2_stderr\": 0.014157554537767498\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5307167235494881,\n \"acc_stderr\": 0.014583792546304037,\n\
\ \"acc_norm\": 0.5725255972696246,\n \"acc_norm_stderr\": 0.014456862944650655\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6154152559251145,\n\
\ \"acc_stderr\": 0.004855027248398162,\n \"acc_norm\": 0.8227444732125074,\n\
\ \"acc_norm_stderr\": 0.0038110434120246593\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6188679245283019,\n \"acc_stderr\": 0.029890609686286648,\n\
\ \"acc_norm\": 0.6188679245283019,\n \"acc_norm_stderr\": 0.029890609686286648\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5694444444444444,\n\
\ \"acc_stderr\": 0.04140685639111503,\n \"acc_norm\": 0.5694444444444444,\n\
\ \"acc_norm_stderr\": 0.04140685639111503\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n\
\ \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.5086705202312138,\n\
\ \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.032579014820998356,\n\
\ \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.032579014820998356\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.044045561573747664,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.044045561573747664\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3439153439153439,\n \"acc_stderr\": 0.024464426625596433,\n \"\
acc_norm\": 0.3439153439153439,\n \"acc_norm_stderr\": 0.024464426625596433\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6451612903225806,\n\
\ \"acc_stderr\": 0.02721888977330877,\n \"acc_norm\": 0.6451612903225806,\n\
\ \"acc_norm_stderr\": 0.02721888977330877\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162933,\n\
\ \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162933\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031596,\n\
\ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031596\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7070707070707071,\n \"acc_stderr\": 0.03242497958178815,\n \"\
acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.03242497958178815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624528,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624528\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.517948717948718,\n \"acc_stderr\": 0.02533466708095492,\n \
\ \"acc_norm\": 0.517948717948718,\n \"acc_norm_stderr\": 0.02533466708095492\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547307,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547307\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5504201680672269,\n \"acc_stderr\": 0.03231293497137707,\n \
\ \"acc_norm\": 0.5504201680672269,\n \"acc_norm_stderr\": 0.03231293497137707\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.763302752293578,\n \"acc_stderr\": 0.01822407811729908,\n \"acc_norm\"\
: 0.763302752293578,\n \"acc_norm_stderr\": 0.01822407811729908\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.46296296296296297,\n\
\ \"acc_stderr\": 0.03400603625538271,\n \"acc_norm\": 0.46296296296296297,\n\
\ \"acc_norm_stderr\": 0.03400603625538271\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03039153369274154\n \
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\"\
: 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460302,\n \"\
acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460302\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.042878587513404565,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.042878587513404565\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\
\ \"acc_stderr\": 0.025598193686652244,\n \"acc_norm\": 0.811965811965812,\n\
\ \"acc_norm_stderr\": 0.025598193686652244\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7752234993614304,\n\
\ \"acc_stderr\": 0.014927447101937153,\n \"acc_norm\": 0.7752234993614304,\n\
\ \"acc_norm_stderr\": 0.014927447101937153\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.02590663263101613,\n\
\ \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.02590663263101613\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3865921787709497,\n\
\ \"acc_stderr\": 0.01628667487910103,\n \"acc_norm\": 0.3865921787709497,\n\
\ \"acc_norm_stderr\": 0.01628667487910103\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.02758281141515961,\n\
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.02758281141515961\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n\
\ \"acc_stderr\": 0.02741799670563099,\n \"acc_norm\": 0.6302250803858521,\n\
\ \"acc_norm_stderr\": 0.02741799670563099\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.02646248777700187,\n\
\ \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.02646248777700187\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40808344198174706,\n\
\ \"acc_stderr\": 0.012552598958563664,\n \"acc_norm\": 0.40808344198174706,\n\
\ \"acc_norm_stderr\": 0.012552598958563664\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5404411764705882,\n \"acc_stderr\": 0.03027332507734575,\n\
\ \"acc_norm\": 0.5404411764705882,\n \"acc_norm_stderr\": 0.03027332507734575\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5375816993464052,\n \"acc_stderr\": 0.02017061497496976,\n \
\ \"acc_norm\": 0.5375816993464052,\n \"acc_norm_stderr\": 0.02017061497496976\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\
\ \"acc_stderr\": 0.03187187537919797,\n \"acc_norm\": 0.7164179104477612,\n\
\ \"acc_norm_stderr\": 0.03187187537919797\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26438188494492043,\n\
\ \"mc1_stderr\": 0.015438211119522515,\n \"mc2\": 0.39750918100971155,\n\
\ \"mc2_stderr\": 0.014157554537767498\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE1_17w-r16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|arc:challenge|25_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hellaswag|10_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T18-33-35.889629.parquet'
- config_name: results
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- results_2023-09-11T18-33-35.889629.parquet
- split: latest
path:
- results_2023-09-11T18-33-35.889629.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE1_17w-r16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE1_17w-r16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE1_17w-r16](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE1_17w-r16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE1_17w-r16",
"harness_truthfulqa_mc_0",
split="train")
```
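Each evaluated task has its own configuration (see the `configs` list in the YAML header above), and every configuration exposes both a timestamped split and a `latest` split. A minimal sketch, assuming the same `datasets` API as above, that loads the per-example details of the ARC task for the most recent run; the per-example column names are not documented here, so the snippet just inspects whatever features are present:

```python
from datasets import load_dataset

# Any config listed in the YAML header works the same way; here we pull the
# ARC details and point at the "latest" split instead of "train".
details = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE1_17w-r16",
    "harness_arc_challenge_25",
    split="latest",
)

# The exact per-example schema is not documented in this card, so inspect it.
print(details.features)
print(details[0])
```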
## Latest results
These are the [latest results from run 2023-09-11T18:33:35.889629](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE1_17w-r16/blob/main/results_2023-09-11T18-33-35.889629.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.561943014174084,
"acc_stderr": 0.03429623620221756,
"acc_norm": 0.5661656936826862,
"acc_norm_stderr": 0.03427639021174252,
"mc1": 0.26438188494492043,
"mc1_stderr": 0.015438211119522515,
"mc2": 0.39750918100971155,
"mc2_stderr": 0.014157554537767498
},
"harness|arc:challenge|25": {
"acc": 0.5307167235494881,
"acc_stderr": 0.014583792546304037,
"acc_norm": 0.5725255972696246,
"acc_norm_stderr": 0.014456862944650655
},
"harness|hellaswag|10": {
"acc": 0.6154152559251145,
"acc_stderr": 0.004855027248398162,
"acc_norm": 0.8227444732125074,
"acc_norm_stderr": 0.0038110434120246593
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6188679245283019,
"acc_stderr": 0.029890609686286648,
"acc_norm": 0.6188679245283019,
"acc_norm_stderr": 0.029890609686286648
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.04140685639111503,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.04140685639111503
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.03811890988940412,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.03811890988940412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4595744680851064,
"acc_stderr": 0.032579014820998356,
"acc_norm": 0.4595744680851064,
"acc_norm_stderr": 0.032579014820998356
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.044045561573747664,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.044045561573747664
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3439153439153439,
"acc_stderr": 0.024464426625596433,
"acc_norm": 0.3439153439153439,
"acc_norm_stderr": 0.024464426625596433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6451612903225806,
"acc_stderr": 0.02721888977330877,
"acc_norm": 0.6451612903225806,
"acc_norm_stderr": 0.02721888977330877
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162933,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162933
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.03697442205031596,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.03697442205031596
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7070707070707071,
"acc_stderr": 0.03242497958178815,
"acc_norm": 0.7070707070707071,
"acc_norm_stderr": 0.03242497958178815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624528,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.517948717948718,
"acc_stderr": 0.02533466708095492,
"acc_norm": 0.517948717948718,
"acc_norm_stderr": 0.02533466708095492
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547307,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547307
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5504201680672269,
"acc_stderr": 0.03231293497137707,
"acc_norm": 0.5504201680672269,
"acc_norm_stderr": 0.03231293497137707
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.763302752293578,
"acc_stderr": 0.01822407811729908,
"acc_norm": 0.763302752293578,
"acc_norm_stderr": 0.01822407811729908
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460302,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460302
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.037311335196738925,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.037311335196738925
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404565,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404565
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.025598193686652244,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.025598193686652244
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7752234993614304,
"acc_stderr": 0.014927447101937153,
"acc_norm": 0.7752234993614304,
"acc_norm_stderr": 0.014927447101937153
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.02590663263101613,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.02590663263101613
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3865921787709497,
"acc_stderr": 0.01628667487910103,
"acc_norm": 0.3865921787709497,
"acc_norm_stderr": 0.01628667487910103
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.02758281141515961,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.02758281141515961
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6302250803858521,
"acc_stderr": 0.02741799670563099,
"acc_norm": 0.6302250803858521,
"acc_norm_stderr": 0.02741799670563099
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.02646248777700187,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.02646248777700187
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.40808344198174706,
"acc_stderr": 0.012552598958563664,
"acc_norm": 0.40808344198174706,
"acc_norm_stderr": 0.012552598958563664
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5404411764705882,
"acc_stderr": 0.03027332507734575,
"acc_norm": 0.5404411764705882,
"acc_norm_stderr": 0.03027332507734575
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5375816993464052,
"acc_stderr": 0.02017061497496976,
"acc_norm": 0.5375816993464052,
"acc_norm_stderr": 0.02017061497496976
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.03187187537919797,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.03187187537919797
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26438188494492043,
"mc1_stderr": 0.015438211119522515,
"mc2": 0.39750918100971155,
"mc2_stderr": 0.014157554537767498
}
}
```
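If you only need the aggregated numbers shown above rather than per-example details, a minimal sketch, assuming the "results" config is loadable like any other configuration (per the `configs` list in the YAML header, its "latest" split points at the most recent results parquet):

```python
from datasets import load_dataset

# The "results" config holds the aggregated metrics shown in the JSON above,
# with "latest" resolving to the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE1_17w-r16",
    "results",
    split="latest",
)
print(results[0])
```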
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
bongo2112/dreambooth-training-images | 2023-09-16T10:28:16.000Z | [
"region:us"
] | bongo2112 | null | null | null | 0 | 0 | Entry not found |
LLMGlobalyTest/da | 2023-09-11T19:30:54.000Z | [
"region:us"
] | LLMGlobalyTest | null | null | null | 0 | 0 | Entry not found |
EZUNIGAF/DA001 | 2023-09-11T19:31:57.000Z | [
"region:us"
] | EZUNIGAF | null | null | null | 0 | 0 | Entry not found |
utangechan/gokjre | 2023-09-11T19:45:45.000Z | [
"region:us"
] | utangechan | null | null | null | 0 | 0 | Entry not found |
KnowledgeSentry/IrisTest | 2023-09-11T19:52:14.000Z | [
"license:cc-by-3.0",
"region:us"
] | KnowledgeSentry | null | null | null | 0 | 0 | ---
license: cc-by-3.0
---
|
bioblendcbdgummiesfored2023/bioblendcbdgummiesfored | 2023-09-11T21:57:02.000Z | [
"license:afl-3.0",
"region:us"
] | bioblendcbdgummiesfored2023 | null | null | null | 0 | 0 | ---
license: afl-3.0
---
|
atishmangaming/kingspeare | 2023-09-11T20:32:02.000Z | [
"region:us"
] | atishmangaming | null | null | null | 0 | 0 | Entry not found |
zoroko/filmyczsk | 2023-09-11T20:18:05.000Z | [
"region:us"
] | zoroko | null | null | null | 0 | 0 | Entry not found |
Roscall/Helenshapiro | 2023-09-11T20:18:53.000Z | [
"region:us"
] | Roscall | null | null | null | 0 | 0 | Entry not found |
cuhunuzhan/filmyzdarma | 2023-09-11T20:23:17.000Z | [
"region:us"
] | cuhunuzhan | null | null | null | 0 | 0 | Entry not found |
cuhunuzhan/filmyase | 2023-09-11T20:34:05.000Z | [
"region:us"
] | cuhunuzhan | null | null | null | 0 | 0 | Entry not found |
Roscall/helen | 2023-09-11T20:27:09.000Z | [
"region:us"
] | Roscall | null | null | null | 0 | 0 | Entry not found |
filmyczzdarma/baka | 2023-09-12T08:00:37.000Z | [
"region:us"
] | filmyczzdarma | null | null | null | 0 | 0 | Entry not found |
arnmig/github-issues | 2023-09-11T21:06:17.000Z | [
"region:us"
] | arnmig | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
dtype: string
- name: labels
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
dtype: string
- name: assignees
dtype: string
- name: milestone
dtype: string
- name: comments
sequence: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: closed_at
dtype: string
- name: author_association
dtype: string
- name: active_lock_reason
dtype: string
- name: draft
dtype: string
- name: pull_request
dtype: string
- name: body
dtype: string
- name: reactions
dtype: string
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: string
- name: state_reason
dtype: string
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 32536203
num_examples: 6214
download_size: 8102507
dataset_size: 32536203
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "github-issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Carcajo/for_game | 2023-09-11T21:22:40.000Z | [
"license:llama2",
"region:us"
] | Carcajo | null | null | null | 0 | 0 | ---
license: llama2
---
|
israel/amh-mine | 2023-09-11T21:26:51.000Z | [
"region:us"
] | israel | null | null | null | 0 | 0 | Entry not found |
bioblendcbdgummiesfored2023/bioblendcbdgummiesfored-website | 2023-09-11T21:36:34.000Z | [
"license:bigscience-bloom-rail-1.0",
"region:us"
] | bioblendcbdgummiesfored2023 | null | null | null | 0 | 0 | ---
license: bigscience-bloom-rail-1.0
---
|
open-llm-leaderboard/details_jondurbin__airocoder-34b-2.1 | 2023-09-11T21:48:54.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of jondurbin/airocoder-34b-2.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airocoder-34b-2.1](https://huggingface.co/jondurbin/airocoder-34b-2.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airocoder-34b-2.1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-11T21:47:37.298626](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airocoder-34b-2.1/blob/main/results_2023-09-11T21-47-37.298626.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5075628766857376,\n\
\ \"acc_stderr\": 0.03521920174371659,\n \"acc_norm\": 0.5112471770212741,\n\
\ \"acc_norm_stderr\": 0.035208668610610463,\n \"mc1\": 0.27050183598531213,\n\
\ \"mc1_stderr\": 0.015550778332842888,\n \"mc2\": 0.40699233950685265,\n\
\ \"mc2_stderr\": 0.014740524560122202\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5059726962457338,\n \"acc_stderr\": 0.014610348300255795,\n\
\ \"acc_norm\": 0.5418088737201365,\n \"acc_norm_stderr\": 0.0145602203087147\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5568611830312686,\n\
\ \"acc_stderr\": 0.004957410545559409,\n \"acc_norm\": 0.7383987253535153,\n\
\ \"acc_norm_stderr\": 0.004386083683839625\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n\
\ \"acc_stderr\": 0.042561937679014075,\n \"acc_norm\": 0.4148148148148148,\n\
\ \"acc_norm_stderr\": 0.042561937679014075\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249033,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249033\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5245283018867924,\n \"acc_stderr\": 0.030735822206205608,\n\
\ \"acc_norm\": 0.5245283018867924,\n \"acc_norm_stderr\": 0.030735822206205608\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4305555555555556,\n\
\ \"acc_stderr\": 0.04140685639111502,\n \"acc_norm\": 0.4305555555555556,\n\
\ \"acc_norm_stderr\": 0.04140685639111502\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4624277456647399,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.4624277456647399,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36772486772486773,\n \"acc_stderr\": 0.024833839825562427,\n \"\
acc_norm\": 0.36772486772486773,\n \"acc_norm_stderr\": 0.024833839825562427\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5967741935483871,\n\
\ \"acc_stderr\": 0.027906150826041146,\n \"acc_norm\": 0.5967741935483871,\n\
\ \"acc_norm_stderr\": 0.027906150826041146\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3694581280788177,\n \"acc_stderr\": 0.03395970381998575,\n\
\ \"acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.03395970381998575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.03742597043806586,\n\
\ \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.03742597043806586\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6161616161616161,\n \"acc_stderr\": 0.03464881675016339,\n \"\
acc_norm\": 0.6161616161616161,\n \"acc_norm_stderr\": 0.03464881675016339\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7357512953367875,\n \"acc_stderr\": 0.03182155050916646,\n\
\ \"acc_norm\": 0.7357512953367875,\n \"acc_norm_stderr\": 0.03182155050916646\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4512820512820513,\n \"acc_stderr\": 0.02523038123893484,\n \
\ \"acc_norm\": 0.4512820512820513,\n \"acc_norm_stderr\": 0.02523038123893484\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.032183581077426124,\n\
\ \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.032183581077426124\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6275229357798165,\n \"acc_stderr\": 0.0207283684576385,\n \"acc_norm\"\
: 0.6275229357798165,\n \"acc_norm_stderr\": 0.0207283684576385\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.35185185185185186,\n\
\ \"acc_stderr\": 0.032568505702936484,\n \"acc_norm\": 0.35185185185185186,\n\
\ \"acc_norm_stderr\": 0.032568505702936484\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236435,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236435\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.679324894514768,\n \"acc_stderr\": 0.03038193194999041,\n \
\ \"acc_norm\": 0.679324894514768,\n \"acc_norm_stderr\": 0.03038193194999041\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4977578475336323,\n\
\ \"acc_stderr\": 0.033557465352232634,\n \"acc_norm\": 0.4977578475336323,\n\
\ \"acc_norm_stderr\": 0.033557465352232634\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4580152671755725,\n \"acc_stderr\": 0.04369802690578756,\n\
\ \"acc_norm\": 0.4580152671755725,\n \"acc_norm_stderr\": 0.04369802690578756\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6319018404907976,\n \"acc_stderr\": 0.03789213935838396,\n\
\ \"acc_norm\": 0.6319018404907976,\n \"acc_norm_stderr\": 0.03789213935838396\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572924,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572924\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7136752136752137,\n\
\ \"acc_stderr\": 0.02961432369045665,\n \"acc_norm\": 0.7136752136752137,\n\
\ \"acc_norm_stderr\": 0.02961432369045665\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6309067688378033,\n\
\ \"acc_stderr\": 0.017256283109124613,\n \"acc_norm\": 0.6309067688378033,\n\
\ \"acc_norm_stderr\": 0.017256283109124613\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5722543352601156,\n \"acc_stderr\": 0.026636539741116093,\n\
\ \"acc_norm\": 0.5722543352601156,\n \"acc_norm_stderr\": 0.026636539741116093\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2581005586592179,\n\
\ \"acc_stderr\": 0.01463518561652784,\n \"acc_norm\": 0.2581005586592179,\n\
\ \"acc_norm_stderr\": 0.01463518561652784\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5032679738562091,\n \"acc_stderr\": 0.028629305194003543,\n\
\ \"acc_norm\": 0.5032679738562091,\n \"acc_norm_stderr\": 0.028629305194003543\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5755627009646302,\n\
\ \"acc_stderr\": 0.028071928247946208,\n \"acc_norm\": 0.5755627009646302,\n\
\ \"acc_norm_stderr\": 0.028071928247946208\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.558641975308642,\n \"acc_stderr\": 0.02762873715566877,\n\
\ \"acc_norm\": 0.558641975308642,\n \"acc_norm_stderr\": 0.02762873715566877\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3404255319148936,\n \"acc_stderr\": 0.028267657482650137,\n \
\ \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.028267657482650137\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35919165580182527,\n\
\ \"acc_stderr\": 0.01225338618758424,\n \"acc_norm\": 0.35919165580182527,\n\
\ \"acc_norm_stderr\": 0.01225338618758424\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.029520095697687765,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.029520095697687765\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.45588235294117646,\n \"acc_stderr\": 0.020148939420415738,\n \
\ \"acc_norm\": 0.45588235294117646,\n \"acc_norm_stderr\": 0.020148939420415738\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5877551020408164,\n \"acc_stderr\": 0.031512360446742695,\n\
\ \"acc_norm\": 0.5877551020408164,\n \"acc_norm_stderr\": 0.031512360446742695\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.681592039800995,\n\
\ \"acc_stderr\": 0.03294118479054095,\n \"acc_norm\": 0.681592039800995,\n\
\ \"acc_norm_stderr\": 0.03294118479054095\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6432748538011696,\n \"acc_stderr\": 0.03674013002860954,\n\
\ \"acc_norm\": 0.6432748538011696,\n \"acc_norm_stderr\": 0.03674013002860954\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27050183598531213,\n\
\ \"mc1_stderr\": 0.015550778332842888,\n \"mc2\": 0.40699233950685265,\n\
\ \"mc2_stderr\": 0.014740524560122202\n }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airocoder-34b-2.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|arc:challenge|25_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hellaswag|10_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T21-47-37.298626.parquet'
- config_name: results
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- results_2023-09-11T21-47-37.298626.parquet
- split: latest
path:
- results_2023-09-11T21-47-37.298626.parquet
---
# Dataset Card for Evaluation run of jondurbin/airocoder-34b-2.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airocoder-34b-2.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airocoder-34b-2.1](https://huggingface.co/jondurbin/airocoder-34b-2.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airocoder-34b-2.1",
"harness_truthfulqa_mc_0",
split="train")
```
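The aggregated metrics shown below are also stored in the dedicated "results" configuration listed in the configs above; as a minimal sketch, they can be loaded the same way (the config name `results` and the `latest` split alias both come from that configs list):
```python
from datasets import load_dataset

# Load the aggregated results for this run; "latest" is an alias for
# the most recent timestamped split, per the configs declared above.
results = load_dataset("open-llm-leaderboard/details_jondurbin__airocoder-34b-2.1",
	"results",
	split="latest")
```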
## Latest results
These are the [latest results from run 2023-09-11T21:47:37.298626](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airocoder-34b-2.1/blob/main/results_2023-09-11T21-47-37.298626.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5075628766857376,
"acc_stderr": 0.03521920174371659,
"acc_norm": 0.5112471770212741,
"acc_norm_stderr": 0.035208668610610463,
"mc1": 0.27050183598531213,
"mc1_stderr": 0.015550778332842888,
"mc2": 0.40699233950685265,
"mc2_stderr": 0.014740524560122202
},
"harness|arc:challenge|25": {
"acc": 0.5059726962457338,
"acc_stderr": 0.014610348300255795,
"acc_norm": 0.5418088737201365,
"acc_norm_stderr": 0.0145602203087147
},
"harness|hellaswag|10": {
"acc": 0.5568611830312686,
"acc_stderr": 0.004957410545559409,
"acc_norm": 0.7383987253535153,
"acc_norm_stderr": 0.004386083683839625
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.042561937679014075,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.042561937679014075
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249033,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249033
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5245283018867924,
"acc_stderr": 0.030735822206205608,
"acc_norm": 0.5245283018867924,
"acc_norm_stderr": 0.030735822206205608
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.04140685639111502,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.04140685639111502
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.046306532033665956,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.046306532033665956
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36772486772486773,
"acc_stderr": 0.024833839825562427,
"acc_norm": 0.36772486772486773,
"acc_norm_stderr": 0.024833839825562427
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5967741935483871,
"acc_stderr": 0.027906150826041146,
"acc_norm": 0.5967741935483871,
"acc_norm_stderr": 0.027906150826041146
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3694581280788177,
"acc_stderr": 0.03395970381998575,
"acc_norm": 0.3694581280788177,
"acc_norm_stderr": 0.03395970381998575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.03742597043806586,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.03742597043806586
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6161616161616161,
"acc_stderr": 0.03464881675016339,
"acc_norm": 0.6161616161616161,
"acc_norm_stderr": 0.03464881675016339
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7357512953367875,
"acc_stderr": 0.03182155050916646,
"acc_norm": 0.7357512953367875,
"acc_norm_stderr": 0.03182155050916646
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4512820512820513,
"acc_stderr": 0.02523038123893484,
"acc_norm": 0.4512820512820513,
"acc_norm_stderr": 0.02523038123893484
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5672268907563025,
"acc_stderr": 0.032183581077426124,
"acc_norm": 0.5672268907563025,
"acc_norm_stderr": 0.032183581077426124
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6275229357798165,
"acc_stderr": 0.0207283684576385,
"acc_norm": 0.6275229357798165,
"acc_norm_stderr": 0.0207283684576385
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.032568505702936484,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.032568505702936484
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03308611113236435,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03308611113236435
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.679324894514768,
"acc_stderr": 0.03038193194999041,
"acc_norm": 0.679324894514768,
"acc_norm_stderr": 0.03038193194999041
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4977578475336323,
"acc_stderr": 0.033557465352232634,
"acc_norm": 0.4977578475336323,
"acc_norm_stderr": 0.033557465352232634
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4580152671755725,
"acc_stderr": 0.04369802690578756,
"acc_norm": 0.4580152671755725,
"acc_norm_stderr": 0.04369802690578756
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6319018404907976,
"acc_stderr": 0.03789213935838396,
"acc_norm": 0.6319018404907976,
"acc_norm_stderr": 0.03789213935838396
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.04498676320572924,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.04498676320572924
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7136752136752137,
"acc_stderr": 0.02961432369045665,
"acc_norm": 0.7136752136752137,
"acc_norm_stderr": 0.02961432369045665
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6309067688378033,
"acc_stderr": 0.017256283109124613,
"acc_norm": 0.6309067688378033,
"acc_norm_stderr": 0.017256283109124613
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.026636539741116093,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.026636539741116093
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2581005586592179,
"acc_stderr": 0.01463518561652784,
"acc_norm": 0.2581005586592179,
"acc_norm_stderr": 0.01463518561652784
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5032679738562091,
"acc_stderr": 0.028629305194003543,
"acc_norm": 0.5032679738562091,
"acc_norm_stderr": 0.028629305194003543
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5755627009646302,
"acc_stderr": 0.028071928247946208,
"acc_norm": 0.5755627009646302,
"acc_norm_stderr": 0.028071928247946208
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.558641975308642,
"acc_stderr": 0.02762873715566877,
"acc_norm": 0.558641975308642,
"acc_norm_stderr": 0.02762873715566877
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.028267657482650137,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.028267657482650137
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35919165580182527,
"acc_stderr": 0.01225338618758424,
"acc_norm": 0.35919165580182527,
"acc_norm_stderr": 0.01225338618758424
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.029520095697687765,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.029520095697687765
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.45588235294117646,
"acc_stderr": 0.020148939420415738,
"acc_norm": 0.45588235294117646,
"acc_norm_stderr": 0.020148939420415738
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5877551020408164,
"acc_stderr": 0.031512360446742695,
"acc_norm": 0.5877551020408164,
"acc_norm_stderr": 0.031512360446742695
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.681592039800995,
"acc_stderr": 0.03294118479054095,
"acc_norm": 0.681592039800995,
"acc_norm_stderr": 0.03294118479054095
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6432748538011696,
"acc_stderr": 0.03674013002860954,
"acc_norm": 0.6432748538011696,
"acc_norm_stderr": 0.03674013002860954
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27050183598531213,
"mc1_stderr": 0.015550778332842888,
"mc2": 0.40699233950685265,
"mc2_stderr": 0.014740524560122202
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
bioblendcbdgummiesfored2023/bioblendcbdgummiesfored-shop | 2023-09-11T21:56:24.000Z | [
"region:us"
] | bioblendcbdgummiesfored2023 | null | null | null | 0 | 0 | Entry not found |
marasama/nva-kasugayama | 2023-09-11T21:52:39.000Z | [
"region:us"
] | marasama | null | null | null | 0 | 0 | Entry not found |
bioblendcbdgummiesfored2023/bioblendcbdgummiesfor-ed | 2023-09-11T22:00:09.000Z | [
"license:creativeml-openrail-m",
"region:us"
] | bioblendcbdgummiesfored2023 | null | null | null | 0 | 0 | ---
license: creativeml-openrail-m
---
|
tienti0000/MizulinaSVC | 2023-09-11T23:07:39.000Z | [
"region:us"
] | tienti0000 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-169M-20230520-done-ctx4096 | 2023-09-11T22:54:15.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of KnutJaegersberg/RWKV-4-PilePlus-169M-20230520-done-ctx4096
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KnutJaegersberg/RWKV-4-PilePlus-169M-20230520-done-ctx4096](https://huggingface.co/KnutJaegersberg/RWKV-4-PilePlus-169M-20230520-done-ctx4096)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-169M-20230520-done-ctx4096\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-11T22:53:03.522910](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-169M-20230520-done-ctx4096/blob/main/results_2023-09-11T22-53-03.522910.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23398687299381754,\n\
\ \"acc_stderr\": 0.030826453989604444,\n \"acc_norm\": 0.23529272064862658,\n\
\ \"acc_norm_stderr\": 0.030844373296834503,\n \"mc1\": 0.22643818849449204,\n\
\ \"mc1_stderr\": 0.014651337324602574,\n \"mc2\": 0.4228833687052385,\n\
\ \"mc2_stderr\": 0.014964083398274203\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.19368600682593856,\n \"acc_stderr\": 0.01154842540997854,\n\
\ \"acc_norm\": 0.23976109215017063,\n \"acc_norm_stderr\": 0.012476304127453954\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.291575383389763,\n\
\ \"acc_stderr\": 0.004535589759202655,\n \"acc_norm\": 0.3225453096992631,\n\
\ \"acc_norm_stderr\": 0.004664950168300714\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.1925925925925926,\n\
\ \"acc_stderr\": 0.03406542058502653,\n \"acc_norm\": 0.1925925925925926,\n\
\ \"acc_norm_stderr\": 0.03406542058502653\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.037455547914624576,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.037455547914624576\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\
\ \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n\
\ \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.25957446808510637,\n \"acc_stderr\": 0.02865917937429232,\n\
\ \"acc_norm\": 0.25957446808510637,\n \"acc_norm_stderr\": 0.02865917937429232\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n\
\ \"acc_stderr\": 0.038351539543994194,\n \"acc_norm\": 0.21052631578947367,\n\
\ \"acc_norm_stderr\": 0.038351539543994194\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23809523809523808,\n \"acc_stderr\": 0.02193587808118476,\n \"\
acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.02193587808118476\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.03932537680392871,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.03932537680392871\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368466,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368466\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n\
\ \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n\
\ \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938062,\n\
\ \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938062\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.18686868686868688,\n \"acc_stderr\": 0.027772533334218977,\n \"\
acc_norm\": 0.18686868686868688,\n \"acc_norm_stderr\": 0.027772533334218977\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860667,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860667\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2076923076923077,\n \"acc_stderr\": 0.020567539567246797,\n\
\ \"acc_norm\": 0.2076923076923077,\n \"acc_norm_stderr\": 0.020567539567246797\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23333333333333334,\n \"acc_stderr\": 0.02578787422095929,\n \
\ \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.02578787422095929\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.18487394957983194,\n \"acc_stderr\": 0.025215992877954202,\n\
\ \"acc_norm\": 0.18487394957983194,\n \"acc_norm_stderr\": 0.025215992877954202\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.17880794701986755,\n \"acc_stderr\": 0.031287448506007245,\n \"\
acc_norm\": 0.17880794701986755,\n \"acc_norm_stderr\": 0.031287448506007245\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1944954128440367,\n \"acc_stderr\": 0.016970289090458064,\n \"\
acc_norm\": 0.1944954128440367,\n \"acc_norm_stderr\": 0.016970289090458064\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.16666666666666666,\n \"acc_stderr\": 0.025416428388767478,\n \"\
acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.025416428388767478\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2489451476793249,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.2489451476793249,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.30493273542600896,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.30493273542600896,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2644628099173554,\n \"acc_stderr\": 0.040261875275912046,\n \"\
acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.040261875275912046\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2147239263803681,\n \"acc_stderr\": 0.03226219377286774,\n\
\ \"acc_norm\": 0.2147239263803681,\n \"acc_norm_stderr\": 0.03226219377286774\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n\
\ \"acc_stderr\": 0.02860595370200427,\n \"acc_norm\": 0.2564102564102564,\n\
\ \"acc_norm_stderr\": 0.02860595370200427\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2503192848020434,\n\
\ \"acc_stderr\": 0.01549108895149458,\n \"acc_norm\": 0.2503192848020434,\n\
\ \"acc_norm_stderr\": 0.01549108895149458\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.025261691219729487,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.025261691219729487\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.02324620264781975,\n\
\ \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.02324620264781975\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2375886524822695,\n \"acc_stderr\": 0.025389512552729906,\n \
\ \"acc_norm\": 0.2375886524822695,\n \"acc_norm_stderr\": 0.025389512552729906\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.1875,\n \"acc_stderr\": 0.023709788253811766,\n \
\ \"acc_norm\": 0.1875,\n \"acc_norm_stderr\": 0.023709788253811766\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2565359477124183,\n \"acc_stderr\": 0.017667841612378984,\n \
\ \"acc_norm\": 0.2565359477124183,\n \"acc_norm_stderr\": 0.017667841612378984\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.23636363636363636,\n\
\ \"acc_stderr\": 0.04069306319721376,\n \"acc_norm\": 0.23636363636363636,\n\
\ \"acc_norm_stderr\": 0.04069306319721376\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.19591836734693877,\n \"acc_stderr\": 0.025409301953225678,\n\
\ \"acc_norm\": 0.19591836734693877,\n \"acc_norm_stderr\": 0.025409301953225678\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22388059701492538,\n\
\ \"acc_stderr\": 0.029475250236017193,\n \"acc_norm\": 0.22388059701492538,\n\
\ \"acc_norm_stderr\": 0.029475250236017193\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3132530120481928,\n\
\ \"acc_stderr\": 0.036108050180310235,\n \"acc_norm\": 0.3132530120481928,\n\
\ \"acc_norm_stderr\": 0.036108050180310235\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.30409356725146197,\n \"acc_stderr\": 0.03528211258245231,\n\
\ \"acc_norm\": 0.30409356725146197,\n \"acc_norm_stderr\": 0.03528211258245231\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22643818849449204,\n\
\ \"mc1_stderr\": 0.014651337324602574,\n \"mc2\": 0.4228833687052385,\n\
\ \"mc2_stderr\": 0.014964083398274203\n }\n}\n```"
repo_url: https://huggingface.co/KnutJaegersberg/RWKV-4-PilePlus-169M-20230520-done-ctx4096
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|arc:challenge|25_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hellaswag|10_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T22-53-03.522910.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T22-53-03.522910.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T22-53-03.522910.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T22-53-03.522910.parquet'
- config_name: results
data_files:
- split: 2023_09_11T22_53_03.522910
path:
- results_2023-09-11T22-53-03.522910.parquet
- split: latest
path:
- results_2023-09-11T22-53-03.522910.parquet
---
# Dataset Card for Evaluation run of KnutJaegersberg/RWKV-4-PilePlus-169M-20230520-done-ctx4096
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KnutJaegersberg/RWKV-4-PilePlus-169M-20230520-done-ctx4096
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KnutJaegersberg/RWKV-4-PilePlus-169M-20230520-done-ctx4096](https://huggingface.co/KnutJaegersberg/RWKV-4-PilePlus-169M-20230520-done-ctx4096) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-169M-20230520-done-ctx4096",
"harness_truthfulqa_mc_0",
split="train")
```
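To explore what else is available, you can list the configurations first (a minimal sketch using the standard `datasets` helpers; the `"results"` config and `"latest"` split names come from the configuration list in this card's metadata):
```python
from datasets import get_dataset_config_names, load_dataset

REPO = "open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-169M-20230520-done-ctx4096"

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(REPO)
print(len(configs), configs[:5])

# Load the aggregated metrics from the latest run.
results = load_dataset(REPO, "results", split="latest")
```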
## Latest results
These are the [latest results from run 2023-09-11T22:53:03.522910](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-169M-20230520-done-ctx4096/blob/main/results_2023-09-11T22-53-03.522910.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23398687299381754,
"acc_stderr": 0.030826453989604444,
"acc_norm": 0.23529272064862658,
"acc_norm_stderr": 0.030844373296834503,
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602574,
"mc2": 0.4228833687052385,
"mc2_stderr": 0.014964083398274203
},
"harness|arc:challenge|25": {
"acc": 0.19368600682593856,
"acc_stderr": 0.01154842540997854,
"acc_norm": 0.23976109215017063,
"acc_norm_stderr": 0.012476304127453954
},
"harness|hellaswag|10": {
"acc": 0.291575383389763,
"acc_stderr": 0.004535589759202655,
"acc_norm": 0.3225453096992631,
"acc_norm_stderr": 0.004664950168300714
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.1925925925925926,
"acc_stderr": 0.03406542058502653,
"acc_norm": 0.1925925925925926,
"acc_norm_stderr": 0.03406542058502653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.037455547914624576,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.037455547914624576
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.25957446808510637,
"acc_stderr": 0.02865917937429232,
"acc_norm": 0.25957446808510637,
"acc_norm_stderr": 0.02865917937429232
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.038351539543994194,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.038351539543994194
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.02193587808118476,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.02193587808118476
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.03932537680392871,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.03932537680392871
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368466,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368466
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938062,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938062
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.18686868686868688,
"acc_stderr": 0.027772533334218977,
"acc_norm": 0.18686868686868688,
"acc_norm_stderr": 0.027772533334218977
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860667,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860667
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2076923076923077,
"acc_stderr": 0.020567539567246797,
"acc_norm": 0.2076923076923077,
"acc_norm_stderr": 0.020567539567246797
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.02578787422095929,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.02578787422095929
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.18487394957983194,
"acc_stderr": 0.025215992877954202,
"acc_norm": 0.18487394957983194,
"acc_norm_stderr": 0.025215992877954202
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.17880794701986755,
"acc_stderr": 0.031287448506007245,
"acc_norm": 0.17880794701986755,
"acc_norm_stderr": 0.031287448506007245
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1944954128440367,
"acc_stderr": 0.016970289090458064,
"acc_norm": 0.1944954128440367,
"acc_norm_stderr": 0.016970289090458064
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.025416428388767478,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.025416428388767478
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2489451476793249,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.2489451476793249,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.30493273542600896,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.30493273542600896,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2644628099173554,
"acc_stderr": 0.040261875275912046,
"acc_norm": 0.2644628099173554,
"acc_norm_stderr": 0.040261875275912046
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2147239263803681,
"acc_stderr": 0.03226219377286774,
"acc_norm": 0.2147239263803681,
"acc_norm_stderr": 0.03226219377286774
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.02860595370200427,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.02860595370200427
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2503192848020434,
"acc_stderr": 0.01549108895149458,
"acc_norm": 0.2503192848020434,
"acc_norm_stderr": 0.01549108895149458
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.025261691219729487,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.025261691219729487
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.02324620264781975,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.02324620264781975
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2375886524822695,
"acc_stderr": 0.025389512552729906,
"acc_norm": 0.2375886524822695,
"acc_norm_stderr": 0.025389512552729906
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.1875,
"acc_stderr": 0.023709788253811766,
"acc_norm": 0.1875,
"acc_norm_stderr": 0.023709788253811766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2565359477124183,
"acc_stderr": 0.017667841612378984,
"acc_norm": 0.2565359477124183,
"acc_norm_stderr": 0.017667841612378984
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.04069306319721376,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.04069306319721376
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19591836734693877,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.19591836734693877,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22388059701492538,
"acc_stderr": 0.029475250236017193,
"acc_norm": 0.22388059701492538,
"acc_norm_stderr": 0.029475250236017193
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3132530120481928,
"acc_stderr": 0.036108050180310235,
"acc_norm": 0.3132530120481928,
"acc_norm_stderr": 0.036108050180310235
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30409356725146197,
"acc_stderr": 0.03528211258245231,
"acc_norm": 0.30409356725146197,
"acc_norm_stderr": 0.03528211258245231
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602574,
"mc2": 0.4228833687052385,
"mc2_stderr": 0.014964083398274203
}
}
```
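The same aggregated numbers can also be fetched as the raw JSON file linked above (a sketch using `huggingface_hub`; the filename is taken from the link in the paragraph preceding the JSON block):
```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated results file referenced above from this dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-169M-20230520-done-ctx4096",
    filename="results_2023-09-11T22-53-03.522910.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

print(list(results))  # top-level keys; the metrics shown above live in this file
```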
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ayoungtech/RealismCERevolution | 2023-09-12T00:05:41.000Z | [
"region:us"
] | ayoungtech | null | null | null | 0 | 0 | Entry not found |
tienti0000/HovanskiySVC | 2023-09-11T23:14:32.000Z | [
"region:us"
] | tienti0000 | null | null | null | 0 | 0 | Entry not found |
tienti0000/SvetovSVC | 2023-09-11T23:17:03.000Z | [
"region:us"
] | tienti0000 | null | null | null | 0 | 0 | Entry not found |
tienti0000/KazSVC | 2023-09-11T23:24:40.000Z | [
"region:us"
] | tienti0000 | null | null | null | 0 | 0 | Entry not found |
tienti0000/NelubovaSVC | 2023-09-11T23:24:41.000Z | [
"region:us"
] | tienti0000 | null | null | null | 0 | 0 | Entry not found |
im-Kitsch/minari_d4rl | 2023-09-13T13:21:04.000Z | [
"task_categories:reinforcement-learning",
"license:apache-2.0",
"region:us"
] | im-Kitsch | null | null | null | 0 | 0 | ---
license: apache-2.0
task_categories:
- reinforcement-learning
---
# Transfer from D4RL datasets to Minari datasets
The transfer script and validation code are in `transfer.py`.
1. Clone the repo:
```
$ git clone https://huggingface.co/datasets/im-Kitsch/minari_d4rl
```
2. Move the datasets directory to the Minari root (default is `~/.minari`):
```
$ mv minari_d4rl/datasets ~/.minari/datasets
```
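A quick sanity check that the transfer worked (a minimal sketch; `minari.list_local_datasets()` is assumed to be available in your installed Minari version):
```
import minari

# The moved datasets should now be registered as local Minari datasets.
print(minari.list_local_datasets().keys())
```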
# TODO
Info fields like `infos/qvel` are not saved, since the interface is not stable yet and those fields cannot be read directly.
bananabot212/kuontol | 2023-09-12T01:24:57.000Z | [
"region:us"
] | bananabot212 | null | null | null | 0 | 0 | Entry not found |
choco9966/results | 2023-09-14T16:21:16.000Z | [
"region:us"
] | choco9966 | null | null | null | 0 | 0 | Entry not found |
xtr8/john | 2023-09-12T01:28:46.000Z | [
"license:other",
"region:us"
] | xtr8 | null | null | null | 0 | 0 | ---
license: other
---
|
seansullivan/PCone-Integrations | 2023-09-12T01:43:53.000Z | [
"license:other",
"region:us"
] | seansullivan | null | null | null | 0 | 0 | ---
license: other
---
|
open-llm-leaderboard/details_quantumaikr__llama-2-70B-chat | 2023-09-12T02:17:20.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of quantumaikr/llama-2-70B-chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [quantumaikr/llama-2-70B-chat](https://huggingface.co/quantumaikr/llama-2-70B-chat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_quantumaikr__llama-2-70B-chat\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T02:16:02.997699](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__llama-2-70B-chat/blob/main/results_2023-09-12T02-16-02.997699.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6907220885493215,\n\
\ \"acc_stderr\": 0.03128215576907498,\n \"acc_norm\": 0.6945100749974468,\n\
\ \"acc_norm_stderr\": 0.03125451504899436,\n \"mc1\": 0.40269277845777235,\n\
\ \"mc1_stderr\": 0.01716883093518722,\n \"mc2\": 0.5731438157700953,\n\
\ \"mc2_stderr\": 0.014660092779943103\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6407849829351536,\n \"acc_stderr\": 0.014020224155839157,\n\
\ \"acc_norm\": 0.6757679180887372,\n \"acc_norm_stderr\": 0.01367881039951882\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6809400517825135,\n\
\ \"acc_stderr\": 0.0046515972099930875,\n \"acc_norm\": 0.869448317068313,\n\
\ \"acc_norm_stderr\": 0.003362208481557298\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7960526315789473,\n \"acc_stderr\": 0.0327900040631005,\n\
\ \"acc_norm\": 0.7960526315789473,\n \"acc_norm_stderr\": 0.0327900040631005\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n\
\ \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.032166008088022675,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.032166008088022675\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6340425531914894,\n \"acc_stderr\": 0.0314895582974553,\n\
\ \"acc_norm\": 0.6340425531914894,\n \"acc_norm_stderr\": 0.0314895582974553\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.455026455026455,\n \"acc_stderr\": 0.025646928361049398,\n \"\
acc_norm\": 0.455026455026455,\n \"acc_norm_stderr\": 0.025646928361049398\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n\
\ \"acc_stderr\": 0.023025899617188716,\n \"acc_norm\": 0.7935483870967742,\n\
\ \"acc_norm_stderr\": 0.023025899617188716\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5467980295566502,\n \"acc_stderr\": 0.03502544650845872,\n\
\ \"acc_norm\": 0.5467980295566502,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\"\
: 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066584,\n\
\ \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066584\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8585858585858586,\n \"acc_stderr\": 0.02482590979334333,\n \"\
acc_norm\": 0.8585858585858586,\n \"acc_norm_stderr\": 0.02482590979334333\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.01673108529360755,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.01673108529360755\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7025641025641025,\n \"acc_stderr\": 0.023177408131465942,\n\
\ \"acc_norm\": 0.7025641025641025,\n \"acc_norm_stderr\": 0.023177408131465942\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230172,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230172\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.02755361446786381,\n \
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02755361446786381\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4966887417218543,\n \"acc_stderr\": 0.04082393379449654,\n \"\
acc_norm\": 0.4966887417218543,\n \"acc_norm_stderr\": 0.04082393379449654\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8917431192660551,\n \"acc_stderr\": 0.01332134844761176,\n \"\
acc_norm\": 0.8917431192660551,\n \"acc_norm_stderr\": 0.01332134844761176\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5972222222222222,\n \"acc_stderr\": 0.03344887382997866,\n \"\
acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.03344887382997866\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9117647058823529,\n \"acc_stderr\": 0.01990739979131695,\n \"\
acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.01990739979131695\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9071729957805907,\n \"acc_stderr\": 0.01888975055095671,\n \
\ \"acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.01888975055095671\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n\
\ \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n\
\ \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540637,\n \"\
acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540637\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5982142857142857,\n\
\ \"acc_stderr\": 0.04653333146973647,\n \"acc_norm\": 0.5982142857142857,\n\
\ \"acc_norm_stderr\": 0.04653333146973647\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573975,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573975\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n\
\ \"acc_stderr\": 0.01831589168562585,\n \"acc_norm\": 0.9145299145299145,\n\
\ \"acc_norm_stderr\": 0.01831589168562585\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.859514687100894,\n\
\ \"acc_stderr\": 0.012426211353093448,\n \"acc_norm\": 0.859514687100894,\n\
\ \"acc_norm_stderr\": 0.012426211353093448\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.023083658586984204,\n\
\ \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.023083658586984204\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5251396648044693,\n\
\ \"acc_stderr\": 0.01670135084268263,\n \"acc_norm\": 0.5251396648044693,\n\
\ \"acc_norm_stderr\": 0.01670135084268263\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958154,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958154\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7588424437299035,\n\
\ \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.7588424437299035,\n\
\ \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8271604938271605,\n \"acc_stderr\": 0.02103851777015737,\n\
\ \"acc_norm\": 0.8271604938271605,\n \"acc_norm_stderr\": 0.02103851777015737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.574468085106383,\n \"acc_stderr\": 0.02949482760014437,\n \
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.02949482760014437\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5664928292046936,\n\
\ \"acc_stderr\": 0.01265681038398397,\n \"acc_norm\": 0.5664928292046936,\n\
\ \"acc_norm_stderr\": 0.01265681038398397\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7095588235294118,\n \"acc_stderr\": 0.027576468622740533,\n\
\ \"acc_norm\": 0.7095588235294118,\n \"acc_norm_stderr\": 0.027576468622740533\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7581699346405228,\n \"acc_stderr\": 0.017322789207784326,\n \
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.017322789207784326\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n\
\ \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.7454545454545455,\n\
\ \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7673469387755102,\n \"acc_stderr\": 0.02704925791589618,\n\
\ \"acc_norm\": 0.7673469387755102,\n \"acc_norm_stderr\": 0.02704925791589618\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n\
\ \"acc_stderr\": 0.023729830881018515,\n \"acc_norm\": 0.8706467661691543,\n\
\ \"acc_norm_stderr\": 0.023729830881018515\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40269277845777235,\n\
\ \"mc1_stderr\": 0.01716883093518722,\n \"mc2\": 0.5731438157700953,\n\
\ \"mc2_stderr\": 0.014660092779943103\n }\n}\n```"
repo_url: https://huggingface.co/quantumaikr/llama-2-70B-chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|arc:challenge|25_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hellaswag|10_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T02-16-02.997699.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T02-16-02.997699.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T02-16-02.997699.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T02-16-02.997699.parquet'
- config_name: results
data_files:
- split: 2023_09_12T02_16_02.997699
path:
- results_2023-09-12T02-16-02.997699.parquet
- split: latest
path:
- results_2023-09-12T02-16-02.997699.parquet
---
# Dataset Card for Evaluation run of quantumaikr/llama-2-70B-chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/quantumaikr/llama-2-70B-chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [quantumaikr/llama-2-70B-chat](https://huggingface.co/quantumaikr/llama-2-70B-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_quantumaikr__llama-2-70B-chat",
"harness_truthfulqa_mc_0",
split="train")
```
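To load the aggregated results instead, a minimal sketch using the `results` configuration and the `latest` split defined in the YAML above:
```python
from datasets import load_dataset

# The "results" config holds the aggregated metrics of each run;
# the "latest" split always points to the most recent one.
results = load_dataset(
    "open-llm-leaderboard/details_quantumaikr__llama-2-70B-chat",
    "results",
    split="latest",
)
print(results[0])
```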
## Latest results
These are the [latest results from run 2023-09-12T02:16:02.997699](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__llama-2-70B-chat/blob/main/results_2023-09-12T02-16-02.997699.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6907220885493215,
"acc_stderr": 0.03128215576907498,
"acc_norm": 0.6945100749974468,
"acc_norm_stderr": 0.03125451504899436,
"mc1": 0.40269277845777235,
"mc1_stderr": 0.01716883093518722,
"mc2": 0.5731438157700953,
"mc2_stderr": 0.014660092779943103
},
"harness|arc:challenge|25": {
"acc": 0.6407849829351536,
"acc_stderr": 0.014020224155839157,
"acc_norm": 0.6757679180887372,
"acc_norm_stderr": 0.01367881039951882
},
"harness|hellaswag|10": {
"acc": 0.6809400517825135,
"acc_stderr": 0.0046515972099930875,
"acc_norm": 0.869448317068313,
"acc_norm_stderr": 0.003362208481557298
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7960526315789473,
"acc_stderr": 0.0327900040631005,
"acc_norm": 0.7960526315789473,
"acc_norm_stderr": 0.0327900040631005
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.032166008088022675,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.032166008088022675
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077636,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6340425531914894,
"acc_stderr": 0.0314895582974553,
"acc_norm": 0.6340425531914894,
"acc_norm_stderr": 0.0314895582974553
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.455026455026455,
"acc_stderr": 0.025646928361049398,
"acc_norm": 0.455026455026455,
"acc_norm_stderr": 0.025646928361049398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.023025899617188716,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.023025899617188716
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5467980295566502,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.5467980295566502,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.027530196355066584,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.027530196355066584
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8585858585858586,
"acc_stderr": 0.02482590979334333,
"acc_norm": 0.8585858585858586,
"acc_norm_stderr": 0.02482590979334333
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.01673108529360755,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.01673108529360755
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7025641025641025,
"acc_stderr": 0.023177408131465942,
"acc_norm": 0.7025641025641025,
"acc_norm_stderr": 0.023177408131465942
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230172,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230172
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.02755361446786381,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.02755361446786381
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4966887417218543,
"acc_stderr": 0.04082393379449654,
"acc_norm": 0.4966887417218543,
"acc_norm_stderr": 0.04082393379449654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8917431192660551,
"acc_stderr": 0.01332134844761176,
"acc_norm": 0.8917431192660551,
"acc_norm_stderr": 0.01332134844761176
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.03344887382997866,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.03344887382997866
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.01990739979131695,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.01990739979131695
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.01888975055095671,
"acc_norm": 0.9071729957805907,
"acc_norm_stderr": 0.01888975055095671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7937219730941704,
"acc_stderr": 0.02715715047956382,
"acc_norm": 0.7937219730941704,
"acc_norm_stderr": 0.02715715047956382
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540637,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540637
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971726,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971726
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5982142857142857,
"acc_stderr": 0.04653333146973647,
"acc_norm": 0.5982142857142857,
"acc_norm_stderr": 0.04653333146973647
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573975,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573975
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.01831589168562585,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.01831589168562585
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.859514687100894,
"acc_stderr": 0.012426211353093448,
"acc_norm": 0.859514687100894,
"acc_norm_stderr": 0.012426211353093448
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5251396648044693,
"acc_stderr": 0.01670135084268263,
"acc_norm": 0.5251396648044693,
"acc_norm_stderr": 0.01670135084268263
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958154,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958154
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7588424437299035,
"acc_stderr": 0.024296594034763426,
"acc_norm": 0.7588424437299035,
"acc_norm_stderr": 0.024296594034763426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8271604938271605,
"acc_stderr": 0.02103851777015737,
"acc_norm": 0.8271604938271605,
"acc_norm_stderr": 0.02103851777015737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.02949482760014437,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.02949482760014437
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5664928292046936,
"acc_stderr": 0.01265681038398397,
"acc_norm": 0.5664928292046936,
"acc_norm_stderr": 0.01265681038398397
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7095588235294118,
"acc_stderr": 0.027576468622740533,
"acc_norm": 0.7095588235294118,
"acc_norm_stderr": 0.027576468622740533
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.017322789207784326,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.017322789207784326
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7673469387755102,
"acc_stderr": 0.02704925791589618,
"acc_norm": 0.7673469387755102,
"acc_norm_stderr": 0.02704925791589618
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.023729830881018515,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.023729830881018515
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40269277845777235,
"mc1_stderr": 0.01716883093518722,
"mc2": 0.5731438157700953,
"mc2_stderr": 0.014660092779943103
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
lilgatouwu/microsoftexcel | 2023-09-12T02:22:19.000Z | [
"region:us"
] | lilgatouwu | null | null | null | 0 | 0 | Entry not found |
doug2123/blog | 2023-09-12T02:20:39.000Z | [
"region:us"
] | doug2123 | null | null | null | 0 | 0 | Entry not found |
nmotlagh/aslg_subset | 2023-09-12T02:47:23.000Z | [
"license:cc",
"region:us"
] | nmotlagh | null | null | null | 0 | 0 | ---
license: cc
---
|
open-llm-leaderboard/details_Mikivis__gpt2-large-lora-stf4 | 2023-09-12T03:06:19.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Mikivis/gpt2-large-lora-stf4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Mikivis/gpt2-large-lora-stf4](https://huggingface.co/Mikivis/gpt2-large-lora-stf4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Mikivis__gpt2-large-lora-stf4\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T03:05:07.244584](https://huggingface.co/datasets/open-llm-leaderboard/details_Mikivis__gpt2-large-lora-stf4/blob/main/results_2023-09-12T03-05-07.244584.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2564104667208077,\n\
\ \"acc_stderr\": 0.03165697552305268,\n \"acc_norm\": 0.2583419996806525,\n\
\ \"acc_norm_stderr\": 0.031673850610715315,\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871112,\n \"mc2\": 0.40843161308228243,\n\
\ \"mc2_stderr\": 0.014469374815397062\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.22013651877133106,\n \"acc_stderr\": 0.012108124883460976,\n\
\ \"acc_norm\": 0.2687713310580205,\n \"acc_norm_stderr\": 0.012955065963710679\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3564031069508066,\n\
\ \"acc_stderr\": 0.0047795744027713865,\n \"acc_norm\": 0.4217287392949612,\n\
\ \"acc_norm_stderr\": 0.004928263494616731\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.03944624162501116,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.03944624162501116\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.20394736842105263,\n \"acc_stderr\": 0.03279000406310049,\n\
\ \"acc_norm\": 0.20394736842105263,\n \"acc_norm_stderr\": 0.03279000406310049\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.025288394502891363,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.025288394502891363\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n\
\ \"acc_stderr\": 0.03186209851641143,\n \"acc_norm\": 0.2254335260115607,\n\
\ \"acc_norm_stderr\": 0.03186209851641143\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.19,\n\
\ \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.030472973363380045,\n\
\ \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.030472973363380045\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.040493392977481425,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.040493392977481425\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.023068188848261114,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.023068188848261114\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.03619604524124252,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.03619604524124252\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3096774193548387,\n\
\ \"acc_stderr\": 0.026302774983517418,\n \"acc_norm\": 0.3096774193548387,\n\
\ \"acc_norm_stderr\": 0.026302774983517418\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3054187192118227,\n \"acc_stderr\": 0.03240661565868407,\n\
\ \"acc_norm\": 0.3054187192118227,\n \"acc_norm_stderr\": 0.03240661565868407\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2787878787878788,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.2787878787878788,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2828282828282828,\n \"acc_stderr\": 0.032087795587867514,\n \"\
acc_norm\": 0.2828282828282828,\n \"acc_norm_stderr\": 0.032087795587867514\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.27979274611398963,\n \"acc_stderr\": 0.03239637046735703,\n\
\ \"acc_norm\": 0.27979274611398963,\n \"acc_norm_stderr\": 0.03239637046735703\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722127992,\n\
\ \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722127992\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02730914058823018,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02730914058823018\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.02934457250063434,\n \
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.02934457250063434\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658754,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658754\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.27522935779816515,\n \"acc_stderr\": 0.019149093743155196,\n \"\
acc_norm\": 0.27522935779816515,\n \"acc_norm_stderr\": 0.019149093743155196\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3101851851851852,\n \"acc_stderr\": 0.03154696285656628,\n \"\
acc_norm\": 0.3101851851851852,\n \"acc_norm_stderr\": 0.03154696285656628\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.27941176470588236,\n \"acc_stderr\": 0.031493281045079556,\n \"\
acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.031493281045079556\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.24050632911392406,\n \"acc_stderr\": 0.027820781981149685,\n \
\ \"acc_norm\": 0.24050632911392406,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.21076233183856502,\n\
\ \"acc_stderr\": 0.027373095500540193,\n \"acc_norm\": 0.21076233183856502,\n\
\ \"acc_norm_stderr\": 0.027373095500540193\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.19083969465648856,\n \"acc_stderr\": 0.03446513350752597,\n\
\ \"acc_norm\": 0.19083969465648856,\n \"acc_norm_stderr\": 0.03446513350752597\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4049586776859504,\n \"acc_stderr\": 0.04481137755942469,\n \"\
acc_norm\": 0.4049586776859504,\n \"acc_norm_stderr\": 0.04481137755942469\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.041577515398656284,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.041577515398656284\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258972,\n\
\ \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258972\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19230769230769232,\n\
\ \"acc_stderr\": 0.025819233256483703,\n \"acc_norm\": 0.19230769230769232,\n\
\ \"acc_norm_stderr\": 0.025819233256483703\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24010217113665389,\n\
\ \"acc_stderr\": 0.015274685213734197,\n \"acc_norm\": 0.24010217113665389,\n\
\ \"acc_norm_stderr\": 0.015274685213734197\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.21676300578034682,\n \"acc_stderr\": 0.02218347766841286,\n\
\ \"acc_norm\": 0.21676300578034682,\n \"acc_norm_stderr\": 0.02218347766841286\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.23202614379084968,\n \"acc_stderr\": 0.02417084087934102,\n\
\ \"acc_norm\": 0.23202614379084968,\n \"acc_norm_stderr\": 0.02417084087934102\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24758842443729903,\n\
\ \"acc_stderr\": 0.024513879973621967,\n \"acc_norm\": 0.24758842443729903,\n\
\ \"acc_norm_stderr\": 0.024513879973621967\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.29012345679012347,\n \"acc_stderr\": 0.025251173936495022,\n\
\ \"acc_norm\": 0.29012345679012347,\n \"acc_norm_stderr\": 0.025251173936495022\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930901996,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930901996\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2542372881355932,\n\
\ \"acc_stderr\": 0.01112112900784067,\n \"acc_norm\": 0.2542372881355932,\n\
\ \"acc_norm_stderr\": 0.01112112900784067\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20955882352941177,\n \"acc_stderr\": 0.02472311040767708,\n\
\ \"acc_norm\": 0.20955882352941177,\n \"acc_norm_stderr\": 0.02472311040767708\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.21568627450980393,\n \"acc_stderr\": 0.016639319350313264,\n \
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.016639319350313264\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.03895091015724137,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.03895091015724137\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24897959183673468,\n \"acc_stderr\": 0.02768297952296023,\n\
\ \"acc_norm\": 0.24897959183673468,\n \"acc_norm_stderr\": 0.02768297952296023\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.31343283582089554,\n\
\ \"acc_stderr\": 0.032801882053486435,\n \"acc_norm\": 0.31343283582089554,\n\
\ \"acc_norm_stderr\": 0.032801882053486435\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.23493975903614459,\n\
\ \"acc_stderr\": 0.03300533186128922,\n \"acc_norm\": 0.23493975903614459,\n\
\ \"acc_norm_stderr\": 0.03300533186128922\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.031267817146631786,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.031267817146631786\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871112,\n \"mc2\": 0.40843161308228243,\n\
\ \"mc2_stderr\": 0.014469374815397062\n }\n}\n```"
repo_url: https://huggingface.co/Mikivis/gpt2-large-lora-stf4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|arc:challenge|25_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hellaswag|10_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T03-05-07.244584.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T03-05-07.244584.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T03-05-07.244584.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T03-05-07.244584.parquet'
- config_name: results
data_files:
- split: 2023_09_12T03_05_07.244584
path:
- results_2023-09-12T03-05-07.244584.parquet
- split: latest
path:
- results_2023-09-12T03-05-07.244584.parquet
---
# Dataset Card for Evaluation run of Mikivis/gpt2-large-lora-stf4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Mikivis/gpt2-large-lora-stf4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Mikivis/gpt2-large-lora-stf4](https://huggingface.co/Mikivis/gpt2-large-lora-stf4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Mikivis__gpt2-large-lora-stf4",
"harness_truthfulqa_mc_0",
split="train")
```
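The aggregated metrics can be loaded the same way; a minimal sketch, assuming the `results` configuration and its `latest` split declared in this dataset's metadata, could be:
```python
from datasets import load_dataset

# Assumption: the "results" config and "latest" split listed in this repo's configs
results = load_dataset("open-llm-leaderboard/details_Mikivis__gpt2-large-lora-stf4",
                       "results",
                       split="latest")
```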
## Latest results
These are the [latest results from run 2023-09-12T03:05:07.244584](https://huggingface.co/datasets/open-llm-leaderboard/details_Mikivis__gpt2-large-lora-stf4/blob/main/results_2023-09-12T03-05-07.244584.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2564104667208077,
"acc_stderr": 0.03165697552305268,
"acc_norm": 0.2583419996806525,
"acc_norm_stderr": 0.031673850610715315,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871112,
"mc2": 0.40843161308228243,
"mc2_stderr": 0.014469374815397062
},
"harness|arc:challenge|25": {
"acc": 0.22013651877133106,
"acc_stderr": 0.012108124883460976,
"acc_norm": 0.2687713310580205,
"acc_norm_stderr": 0.012955065963710679
},
"harness|hellaswag|10": {
"acc": 0.3564031069508066,
"acc_stderr": 0.0047795744027713865,
"acc_norm": 0.4217287392949612,
"acc_norm_stderr": 0.004928263494616731
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.03944624162501116,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.03944624162501116
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.20394736842105263,
"acc_stderr": 0.03279000406310049,
"acc_norm": 0.20394736842105263,
"acc_norm_stderr": 0.03279000406310049
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.025288394502891363,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.025288394502891363
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641143,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641143
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.030472973363380045,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.030472973363380045
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481425,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481425
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.023068188848261114,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.023068188848261114
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.03619604524124252,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.03619604524124252
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3096774193548387,
"acc_stderr": 0.026302774983517418,
"acc_norm": 0.3096774193548387,
"acc_norm_stderr": 0.026302774983517418
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3054187192118227,
"acc_stderr": 0.03240661565868407,
"acc_norm": 0.3054187192118227,
"acc_norm_stderr": 0.03240661565868407
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2787878787878788,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.2787878787878788,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2828282828282828,
"acc_stderr": 0.032087795587867514,
"acc_norm": 0.2828282828282828,
"acc_norm_stderr": 0.032087795587867514
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27979274611398963,
"acc_stderr": 0.03239637046735703,
"acc_norm": 0.27979274611398963,
"acc_norm_stderr": 0.03239637046735703
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.020752423722127992,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.020752423722127992
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02730914058823018,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02730914058823018
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.02934457250063434,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.02934457250063434
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658754,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.27522935779816515,
"acc_stderr": 0.019149093743155196,
"acc_norm": 0.27522935779816515,
"acc_norm_stderr": 0.019149093743155196
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3101851851851852,
"acc_stderr": 0.03154696285656628,
"acc_norm": 0.3101851851851852,
"acc_norm_stderr": 0.03154696285656628
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.031493281045079556,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.031493281045079556
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.24050632911392406,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.24050632911392406,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.21076233183856502,
"acc_stderr": 0.027373095500540193,
"acc_norm": 0.21076233183856502,
"acc_norm_stderr": 0.027373095500540193
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.19083969465648856,
"acc_stderr": 0.03446513350752597,
"acc_norm": 0.19083969465648856,
"acc_norm_stderr": 0.03446513350752597
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4049586776859504,
"acc_stderr": 0.04481137755942469,
"acc_norm": 0.4049586776859504,
"acc_norm_stderr": 0.04481137755942469
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.041577515398656284,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.041577515398656284
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258972,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258972
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19230769230769232,
"acc_stderr": 0.025819233256483703,
"acc_norm": 0.19230769230769232,
"acc_norm_stderr": 0.025819233256483703
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24010217113665389,
"acc_stderr": 0.015274685213734197,
"acc_norm": 0.24010217113665389,
"acc_norm_stderr": 0.015274685213734197
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.21676300578034682,
"acc_stderr": 0.02218347766841286,
"acc_norm": 0.21676300578034682,
"acc_norm_stderr": 0.02218347766841286
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23202614379084968,
"acc_stderr": 0.02417084087934102,
"acc_norm": 0.23202614379084968,
"acc_norm_stderr": 0.02417084087934102
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24758842443729903,
"acc_stderr": 0.024513879973621967,
"acc_norm": 0.24758842443729903,
"acc_norm_stderr": 0.024513879973621967
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.29012345679012347,
"acc_stderr": 0.025251173936495022,
"acc_norm": 0.29012345679012347,
"acc_norm_stderr": 0.025251173936495022
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930901996,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930901996
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2542372881355932,
"acc_stderr": 0.01112112900784067,
"acc_norm": 0.2542372881355932,
"acc_norm_stderr": 0.01112112900784067
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20955882352941177,
"acc_stderr": 0.02472311040767708,
"acc_norm": 0.20955882352941177,
"acc_norm_stderr": 0.02472311040767708
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.016639319350313264,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.016639319350313264
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.03895091015724137,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.03895091015724137
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24897959183673468,
"acc_stderr": 0.02768297952296023,
"acc_norm": 0.24897959183673468,
"acc_norm_stderr": 0.02768297952296023
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.31343283582089554,
"acc_stderr": 0.032801882053486435,
"acc_norm": 0.31343283582089554,
"acc_norm_stderr": 0.032801882053486435
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-virology|5": {
"acc": 0.23493975903614459,
"acc_stderr": 0.03300533186128922,
"acc_norm": 0.23493975903614459,
"acc_norm_stderr": 0.03300533186128922
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871112,
"mc2": 0.40843161308228243,
"mc2_stderr": 0.014469374815397062
}
}
```
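To aggregate the per-task numbers yourself, a minimal sketch (assuming the JSON above has been parsed into a Python dict named `results`) could be:
```python
# Assumption: `results` is the dictionary printed above
mmlu = {task: m["acc_norm"]
        for task, m in results.items()
        if task.startswith("harness|hendrycksTest")}
print(f"Average acc_norm over {len(mmlu)} MMLU tasks: {sum(mmlu.values()) / len(mmlu):.4f}")
```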
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Riiid__sheep-duck-llama-2 | 2023-09-19T02:43:00.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Riiid/sheep-duck-llama-2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Riiid/sheep-duck-llama-2](https://huggingface.co/Riiid/sheep-duck-llama-2) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Riiid__sheep-duck-llama-2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-19T02:41:38.567550](https://huggingface.co/datasets/open-llm-leaderboard/details_Riiid__sheep-duck-llama-2/blob/main/results_2023-09-19T02-41-38.567550.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7074787526637408,\n\
\ \"acc_stderr\": 0.030842770794867788,\n \"acc_norm\": 0.7112713043078007,\n\
\ \"acc_norm_stderr\": 0.03081173438001915,\n \"mc1\": 0.4663402692778458,\n\
\ \"mc1_stderr\": 0.017463793867168103,\n \"mc2\": 0.6379733867215786,\n\
\ \"mc2_stderr\": 0.014804542452694204\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6851535836177475,\n \"acc_stderr\": 0.013572657703084948,\n\
\ \"acc_norm\": 0.7226962457337884,\n \"acc_norm_stderr\": 0.013082095839059376\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6915952997410875,\n\
\ \"acc_stderr\": 0.0046089078729577085,\n \"acc_norm\": 0.8778131846245768,\n\
\ \"acc_norm_stderr\": 0.003268321260913631\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8026315789473685,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.8026315789473685,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7471698113207547,\n \"acc_stderr\": 0.02674989977124121,\n\
\ \"acc_norm\": 0.7471698113207547,\n \"acc_norm_stderr\": 0.02674989977124121\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.03216600808802267,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.03216600808802267\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6851063829787234,\n \"acc_stderr\": 0.03036358219723817,\n\
\ \"acc_norm\": 0.6851063829787234,\n \"acc_norm_stderr\": 0.03036358219723817\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.03996629574876719,\n\
\ \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.03996629574876719\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4708994708994709,\n \"acc_stderr\": 0.02570765861415495,\n \"\
acc_norm\": 0.4708994708994709,\n \"acc_norm_stderr\": 0.02570765861415495\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8225806451612904,\n\
\ \"acc_stderr\": 0.021732540689329286,\n \"acc_norm\": 0.8225806451612904,\n\
\ \"acc_norm_stderr\": 0.021732540689329286\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5665024630541872,\n \"acc_stderr\": 0.034867317274198714,\n\
\ \"acc_norm\": 0.5665024630541872,\n \"acc_norm_stderr\": 0.034867317274198714\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\"\
: 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781678,\n\
\ \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781678\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8888888888888888,\n \"acc_stderr\": 0.022390787638216763,\n \"\
acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.022390787638216763\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240528,\n\
\ \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240528\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7102564102564103,\n \"acc_stderr\": 0.023000628243687968,\n\
\ \"acc_norm\": 0.7102564102564103,\n \"acc_norm_stderr\": 0.023000628243687968\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.0284934650910286,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.0284934650910286\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7773109243697479,\n \"acc_stderr\": 0.027025433498882385,\n\
\ \"acc_norm\": 0.7773109243697479,\n \"acc_norm_stderr\": 0.027025433498882385\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"\
acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9009174311926605,\n \"acc_stderr\": 0.01280978008187893,\n \"\
acc_norm\": 0.9009174311926605,\n \"acc_norm_stderr\": 0.01280978008187893\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5972222222222222,\n \"acc_stderr\": 0.03344887382997865,\n \"\
acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.03344887382997865\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073315,\n \"\
acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073315\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640255,\n \
\ \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640255\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\
\ \"acc_stderr\": 0.026936111912802273,\n \"acc_norm\": 0.7982062780269058,\n\
\ \"acc_norm_stderr\": 0.026936111912802273\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8549618320610687,\n \"acc_stderr\": 0.030884661089515368,\n\
\ \"acc_norm\": 0.8549618320610687,\n \"acc_norm_stderr\": 0.030884661089515368\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035206,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035206\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.03602814176392645,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.03602814176392645\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8282208588957055,\n \"acc_stderr\": 0.029634717272371037,\n\
\ \"acc_norm\": 0.8282208588957055,\n \"acc_norm_stderr\": 0.029634717272371037\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n\
\ \"acc_stderr\": 0.04726835553719098,\n \"acc_norm\": 0.5446428571428571,\n\
\ \"acc_norm_stderr\": 0.04726835553719098\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n\
\ \"acc_stderr\": 0.01831589168562585,\n \"acc_norm\": 0.9145299145299145,\n\
\ \"acc_norm_stderr\": 0.01831589168562585\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8697318007662835,\n\
\ \"acc_stderr\": 0.012036729568216055,\n \"acc_norm\": 0.8697318007662835,\n\
\ \"acc_norm_stderr\": 0.012036729568216055\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7832369942196532,\n \"acc_stderr\": 0.022183477668412856,\n\
\ \"acc_norm\": 0.7832369942196532,\n \"acc_norm_stderr\": 0.022183477668412856\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6245810055865921,\n\
\ \"acc_stderr\": 0.01619510424846353,\n \"acc_norm\": 0.6245810055865921,\n\
\ \"acc_norm_stderr\": 0.01619510424846353\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.02463004897982477,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.02463004897982477\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7909967845659164,\n\
\ \"acc_stderr\": 0.02309314039837422,\n \"acc_norm\": 0.7909967845659164,\n\
\ \"acc_norm_stderr\": 0.02309314039837422\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8425925925925926,\n \"acc_stderr\": 0.020263764996385717,\n\
\ \"acc_norm\": 0.8425925925925926,\n \"acc_norm_stderr\": 0.020263764996385717\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5780141843971631,\n \"acc_stderr\": 0.029462189233370593,\n \
\ \"acc_norm\": 0.5780141843971631,\n \"acc_norm_stderr\": 0.029462189233370593\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5867014341590613,\n\
\ \"acc_stderr\": 0.012576779494860076,\n \"acc_norm\": 0.5867014341590613,\n\
\ \"acc_norm_stderr\": 0.012576779494860076\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.02667925227010314,\n\
\ \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.02667925227010314\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7696078431372549,\n \"acc_stderr\": 0.01703522925803403,\n \
\ \"acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.01703522925803403\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7545454545454545,\n\
\ \"acc_stderr\": 0.041220665028782855,\n \"acc_norm\": 0.7545454545454545,\n\
\ \"acc_norm_stderr\": 0.041220665028782855\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7959183673469388,\n \"acc_stderr\": 0.025801283475090496,\n\
\ \"acc_norm\": 0.7959183673469388,\n \"acc_norm_stderr\": 0.025801283475090496\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n\
\ \"acc_stderr\": 0.023729830881018533,\n \"acc_norm\": 0.8706467661691543,\n\
\ \"acc_norm_stderr\": 0.023729830881018533\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015575,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015575\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4663402692778458,\n\
\ \"mc1_stderr\": 0.017463793867168103,\n \"mc2\": 0.6379733867215786,\n\
\ \"mc2_stderr\": 0.014804542452694204\n }\n}\n```"
repo_url: https://huggingface.co/Riiid/sheep-duck-llama-2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|arc:challenge|25_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|arc:challenge|25_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hellaswag|10_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hellaswag|10_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T04-15-20.917267.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-19T02-41-38.567550.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-19T02-41-38.567550.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T04-15-20.917267.parquet'
- split: 2023_09_19T02_41_38.567550
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-19T02-41-38.567550.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-19T02-41-38.567550.parquet'
- config_name: results
data_files:
- split: 2023_09_12T04_15_20.917267
path:
- results_2023-09-12T04-15-20.917267.parquet
- split: 2023_09_19T02_41_38.567550
path:
- results_2023-09-19T02-41-38.567550.parquet
- split: latest
path:
- results_2023-09-19T02-41-38.567550.parquet
---
# Dataset Card for Evaluation run of Riiid/sheep-duck-llama-2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Riiid/sheep-duck-llama-2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Riiid/sheep-duck-llama-2](https://huggingface.co/Riiid/sheep-duck-llama-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Riiid__sheep-duck-llama-2",
"harness_truthfulqa_mc_0",
split="train")
```
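The same call works for any configuration listed in this card's metadata. As a minimal sketch (using the "results" configuration and the "latest" split defined above), you can also load the aggregated results directly:
```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "latest" split always
# resolves to the most recent run's parquet file listed in the metadata.
results = load_dataset("open-llm-leaderboard/details_Riiid__sheep-duck-llama-2",
                       "results",
                       split="latest")
```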
## Latest results
These are the [latest results from run 2023-09-19T02:41:38.567550](https://huggingface.co/datasets/open-llm-leaderboard/details_Riiid__sheep-duck-llama-2/blob/main/results_2023-09-19T02-41-38.567550.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7074787526637408,
"acc_stderr": 0.030842770794867788,
"acc_norm": 0.7112713043078007,
"acc_norm_stderr": 0.03081173438001915,
"mc1": 0.4663402692778458,
"mc1_stderr": 0.017463793867168103,
"mc2": 0.6379733867215786,
"mc2_stderr": 0.014804542452694204
},
"harness|arc:challenge|25": {
"acc": 0.6851535836177475,
"acc_stderr": 0.013572657703084948,
"acc_norm": 0.7226962457337884,
"acc_norm_stderr": 0.013082095839059376
},
"harness|hellaswag|10": {
"acc": 0.6915952997410875,
"acc_stderr": 0.0046089078729577085,
"acc_norm": 0.8778131846245768,
"acc_norm_stderr": 0.003268321260913631
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04171654161354543,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04171654161354543
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8026315789473685,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.8026315789473685,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7471698113207547,
"acc_stderr": 0.02674989977124121,
"acc_norm": 0.7471698113207547,
"acc_norm_stderr": 0.02674989977124121
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.03216600808802267,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.03216600808802267
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6851063829787234,
"acc_stderr": 0.03036358219723817,
"acc_norm": 0.6851063829787234,
"acc_norm_stderr": 0.03036358219723817
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6413793103448275,
"acc_stderr": 0.03996629574876719,
"acc_norm": 0.6413793103448275,
"acc_norm_stderr": 0.03996629574876719
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4708994708994709,
"acc_stderr": 0.02570765861415495,
"acc_norm": 0.4708994708994709,
"acc_norm_stderr": 0.02570765861415495
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8225806451612904,
"acc_stderr": 0.021732540689329286,
"acc_norm": 0.8225806451612904,
"acc_norm_stderr": 0.021732540689329286
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5665024630541872,
"acc_stderr": 0.034867317274198714,
"acc_norm": 0.5665024630541872,
"acc_norm_stderr": 0.034867317274198714
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781678,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781678
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.022390787638216763,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.022390787638216763
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240528,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7102564102564103,
"acc_stderr": 0.023000628243687968,
"acc_norm": 0.7102564102564103,
"acc_norm_stderr": 0.023000628243687968
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.0284934650910286,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.0284934650910286
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7773109243697479,
"acc_stderr": 0.027025433498882385,
"acc_norm": 0.7773109243697479,
"acc_norm_stderr": 0.027025433498882385
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9009174311926605,
"acc_stderr": 0.01280978008187893,
"acc_norm": 0.9009174311926605,
"acc_norm_stderr": 0.01280978008187893
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.03344887382997865,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.03344887382997865
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073315,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073315
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.019269323025640255,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.019269323025640255
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.026936111912802273,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.026936111912802273
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8549618320610687,
"acc_stderr": 0.030884661089515368,
"acc_norm": 0.8549618320610687,
"acc_norm_stderr": 0.030884661089515368
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035206,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035206
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.03602814176392645,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.03602814176392645
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8282208588957055,
"acc_stderr": 0.029634717272371037,
"acc_norm": 0.8282208588957055,
"acc_norm_stderr": 0.029634717272371037
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719098,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719098
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.01831589168562585,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.01831589168562585
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8697318007662835,
"acc_stderr": 0.012036729568216055,
"acc_norm": 0.8697318007662835,
"acc_norm_stderr": 0.012036729568216055
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7832369942196532,
"acc_stderr": 0.022183477668412856,
"acc_norm": 0.7832369942196532,
"acc_norm_stderr": 0.022183477668412856
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6245810055865921,
"acc_stderr": 0.01619510424846353,
"acc_norm": 0.6245810055865921,
"acc_norm_stderr": 0.01619510424846353
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.02463004897982477,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.02463004897982477
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7909967845659164,
"acc_stderr": 0.02309314039837422,
"acc_norm": 0.7909967845659164,
"acc_norm_stderr": 0.02309314039837422
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.020263764996385717,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.020263764996385717
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5780141843971631,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.5780141843971631,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5867014341590613,
"acc_stderr": 0.012576779494860076,
"acc_norm": 0.5867014341590613,
"acc_norm_stderr": 0.012576779494860076
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.02667925227010314,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.02667925227010314
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.01703522925803403,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.01703522925803403
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7545454545454545,
"acc_stderr": 0.041220665028782855,
"acc_norm": 0.7545454545454545,
"acc_norm_stderr": 0.041220665028782855
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7959183673469388,
"acc_stderr": 0.025801283475090496,
"acc_norm": 0.7959183673469388,
"acc_norm_stderr": 0.025801283475090496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.023729830881018533,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.023729830881018533
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015575,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015575
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4663402692778458,
"mc1_stderr": 0.017463793867168103,
"mc2": 0.6379733867215786,
"mc2_stderr": 0.014804542452694204
}
}
```
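If you prefer the raw JSON over the parquet-backed dataset, here is a small sketch using `huggingface_hub` (assuming the file mirrors the dict printed above; the exact key layout is not guaranteed):
```python
import json
from huggingface_hub import hf_hub_download

# Download the aggregated-results file linked above from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Riiid__sheep-duck-llama-2",
    filename="results_2023-09-19T02-41-38.567550.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# Assumption: the JSON top level matches the dict shown above.
print(data["all"]["acc"])
```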
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
bongo2112/moodewji-SDxl-output-images | 2023-09-12T10:40:58.000Z | [
"region:us"
] | bongo2112 | null | null | null | 0 | 0 | Entry not found |
khalidalt/xlsum_clm | 2023-09-12T04:36:27.000Z | [
"region:us"
] | khalidalt | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: gem_id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: target
dtype: string
- name: references
list: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 217986489
num_examples: 37519
download_size: 107517494
dataset_size: 217986489
---
# Dataset Card for "xlsum_clm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_behnamsh__gpt2_platypus-camel_physics | 2023-09-12T04:48:43.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of behnamsh/gpt2_platypus-camel_physics
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [behnamsh/gpt2_platypus-camel_physics](https://huggingface.co/behnamsh/gpt2_platypus-camel_physics)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_behnamsh__gpt2_platypus-camel_physics\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T04:47:32.538128](https://huggingface.co/datasets/open-llm-leaderboard/details_behnamsh__gpt2_platypus-camel_physics/blob/main/results_2023-09-12T04-47-32.538128.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25809627898446486,\n\
\ \"acc_stderr\": 0.0314782253560422,\n \"acc_norm\": 0.25906413796586775,\n\
\ \"acc_norm_stderr\": 0.0314927074145284,\n \"mc1\": 0.22399020807833536,\n\
\ \"mc1_stderr\": 0.014594964329474203,\n \"mc2\": 0.3895405171510521,\n\
\ \"mc2_stderr\": 0.014756538274909399\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.19112627986348124,\n \"acc_stderr\": 0.011490055292778587,\n\
\ \"acc_norm\": 0.22781569965870307,\n \"acc_norm_stderr\": 0.012256708602326916\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.29197371041625175,\n\
\ \"acc_stderr\": 0.004537410615572942,\n \"acc_norm\": 0.3123879705238,\n\
\ \"acc_norm_stderr\": 0.004625198756710242\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.03944624162501116,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.03944624162501116\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.24528301886792453,\n \"acc_stderr\": 0.0264803571798957,\n\
\ \"acc_norm\": 0.24528301886792453,\n \"acc_norm_stderr\": 0.0264803571798957\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n\
\ \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.028504856470514196,\n\
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.028504856470514196\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.036646663372252565,\n\
\ \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.036646663372252565\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.036196045241242515,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.036196045241242515\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3032258064516129,\n\
\ \"acc_stderr\": 0.02614868593067175,\n \"acc_norm\": 0.3032258064516129,\n\
\ \"acc_norm_stderr\": 0.02614868593067175\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114475,\n\
\ \"acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114475\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\"\
: 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547153,\n \"\
acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547153\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n\
\ \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.30256410256410254,\n \"acc_stderr\": 0.023290888053772725,\n\
\ \"acc_norm\": 0.30256410256410254,\n \"acc_norm_stderr\": 0.023290888053772725\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.31512605042016806,\n \"acc_stderr\": 0.03017680828897434,\n\
\ \"acc_norm\": 0.31512605042016806,\n \"acc_norm_stderr\": 0.03017680828897434\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436775,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3504587155963303,\n \"acc_stderr\": 0.020456077599824457,\n \"\
acc_norm\": 0.3504587155963303,\n \"acc_norm_stderr\": 0.020456077599824457\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n\
\ \"acc_stderr\": 0.030587591351604233,\n \"acc_norm\": 0.2549019607843137,\n\
\ \"acc_norm_stderr\": 0.030587591351604233\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.2320675105485232,\n \"acc_stderr\": 0.027479744550808503,\n\
\ \"acc_norm\": 0.2320675105485232,\n \"acc_norm_stderr\": 0.027479744550808503\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.18834080717488788,\n\
\ \"acc_stderr\": 0.02624113299640728,\n \"acc_norm\": 0.18834080717488788,\n\
\ \"acc_norm_stderr\": 0.02624113299640728\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.03727673575596919,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.03727673575596919\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.36363636363636365,\n \"acc_stderr\": 0.04391326286724071,\n \"\
acc_norm\": 0.36363636363636365,\n \"acc_norm_stderr\": 0.04391326286724071\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16071428571428573,\n\
\ \"acc_stderr\": 0.03485946096475741,\n \"acc_norm\": 0.16071428571428573,\n\
\ \"acc_norm_stderr\": 0.03485946096475741\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258972,\n\
\ \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258972\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.20512820512820512,\n\
\ \"acc_stderr\": 0.02645350805404035,\n \"acc_norm\": 0.20512820512820512,\n\
\ \"acc_norm_stderr\": 0.02645350805404035\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.20434227330779056,\n\
\ \"acc_stderr\": 0.0144191239809319,\n \"acc_norm\": 0.20434227330779056,\n\
\ \"acc_norm_stderr\": 0.0144191239809319\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.021855255263421802,\n\
\ \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.021855255263421802\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.02473998135511359,\n\
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.02473998135511359\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.19614147909967847,\n\
\ \"acc_stderr\": 0.022552447780478022,\n \"acc_norm\": 0.19614147909967847,\n\
\ \"acc_norm_stderr\": 0.022552447780478022\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.023246202647819746,\n\
\ \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.023246202647819746\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2765957446808511,\n \"acc_stderr\": 0.026684564340460987,\n \
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.026684564340460987\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2516297262059974,\n\
\ \"acc_stderr\": 0.011083276280441907,\n \"acc_norm\": 0.2516297262059974,\n\
\ \"acc_norm_stderr\": 0.011083276280441907\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.030161911930767102,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.030161911930767102\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25980392156862747,\n \"acc_stderr\": 0.017740899509177788,\n \
\ \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.017740899509177788\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.03831305140884603,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.03831305140884603\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.031362502409358936,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.031362502409358936\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22388059701492538,\n\
\ \"acc_stderr\": 0.029475250236017193,\n \"acc_norm\": 0.22388059701492538,\n\
\ \"acc_norm_stderr\": 0.029475250236017193\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.1927710843373494,\n\
\ \"acc_stderr\": 0.030709824050565274,\n \"acc_norm\": 0.1927710843373494,\n\
\ \"acc_norm_stderr\": 0.030709824050565274\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22399020807833536,\n\
\ \"mc1_stderr\": 0.014594964329474203,\n \"mc2\": 0.3895405171510521,\n\
\ \"mc2_stderr\": 0.014756538274909399\n }\n}\n```"
repo_url: https://huggingface.co/behnamsh/gpt2_platypus-camel_physics
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|arc:challenge|25_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hellaswag|10_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T04-47-32.538128.parquet'
- config_name: results
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- results_2023-09-12T04-47-32.538128.parquet
- split: latest
path:
- results_2023-09-12T04-47-32.538128.parquet
---
# Dataset Card for Evaluation run of behnamsh/gpt2_platypus-camel_physics
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/behnamsh/gpt2_platypus-camel_physics
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [behnamsh/gpt2_platypus-camel_physics](https://huggingface.co/behnamsh/gpt2_platypus-camel_physics) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_behnamsh__gpt2_platypus-camel_physics",
"harness_truthfulqa_mc_0",
split="train")
```
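As a variation (a minimal sketch, assuming only the configs and splits listed in this card's metadata), you can also load the aggregated metrics via the "results" configuration, or a single task's per-sample details via its own configuration and the "latest" split:

```python
from datasets import load_dataset

# Aggregated metrics for the run (the "results" config defined above)
results = load_dataset(
    "open-llm-leaderboard/details_behnamsh__gpt2_platypus-camel_physics",
    "results",
    split="latest",
)

# Per-sample details for one task, e.g. the 5-shot virology subset
virology = load_dataset(
    "open-llm-leaderboard/details_behnamsh__gpt2_platypus-camel_physics",
    "harness_hendrycksTest_virology_5",
    split="latest",
)
```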
## Latest results
These are the [latest results from run 2023-09-12T04:47:32.538128](https://huggingface.co/datasets/open-llm-leaderboard/details_behnamsh__gpt2_platypus-camel_physics/blob/main/results_2023-09-12T04-47-32.538128.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25809627898446486,
"acc_stderr": 0.0314782253560422,
"acc_norm": 0.25906413796586775,
"acc_norm_stderr": 0.0314927074145284,
"mc1": 0.22399020807833536,
"mc1_stderr": 0.014594964329474203,
"mc2": 0.3895405171510521,
"mc2_stderr": 0.014756538274909399
},
"harness|arc:challenge|25": {
"acc": 0.19112627986348124,
"acc_stderr": 0.011490055292778587,
"acc_norm": 0.22781569965870307,
"acc_norm_stderr": 0.012256708602326916
},
"harness|hellaswag|10": {
"acc": 0.29197371041625175,
"acc_stderr": 0.004537410615572942,
"acc_norm": 0.3123879705238,
"acc_norm_stderr": 0.004625198756710242
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.03944624162501116,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.03944624162501116
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24528301886792453,
"acc_stderr": 0.0264803571798957,
"acc_norm": 0.24528301886792453,
"acc_norm_stderr": 0.0264803571798957
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.028504856470514196,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.028504856470514196
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.036646663372252565,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.036646663372252565
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.036196045241242515,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.036196045241242515
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3032258064516129,
"acc_stderr": 0.02614868593067175,
"acc_norm": 0.3032258064516129,
"acc_norm_stderr": 0.02614868593067175
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114475,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114475
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35353535353535354,
"acc_stderr": 0.03406086723547153,
"acc_norm": 0.35353535353535354,
"acc_norm_stderr": 0.03406086723547153
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.30256410256410254,
"acc_stderr": 0.023290888053772725,
"acc_norm": 0.30256410256410254,
"acc_norm_stderr": 0.023290888053772725
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895991,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895991
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31512605042016806,
"acc_stderr": 0.03017680828897434,
"acc_norm": 0.31512605042016806,
"acc_norm_stderr": 0.03017680828897434
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436775,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3504587155963303,
"acc_stderr": 0.020456077599824457,
"acc_norm": 0.3504587155963303,
"acc_norm_stderr": 0.020456077599824457
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604233,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604233
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2320675105485232,
"acc_stderr": 0.027479744550808503,
"acc_norm": 0.2320675105485232,
"acc_norm_stderr": 0.027479744550808503
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.18834080717488788,
"acc_stderr": 0.02624113299640728,
"acc_norm": 0.18834080717488788,
"acc_norm_stderr": 0.02624113299640728
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.03727673575596919,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.03727673575596919
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.36363636363636365,
"acc_stderr": 0.04391326286724071,
"acc_norm": 0.36363636363636365,
"acc_norm_stderr": 0.04391326286724071
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25153374233128833,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.25153374233128833,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.16071428571428573,
"acc_stderr": 0.03485946096475741,
"acc_norm": 0.16071428571428573,
"acc_norm_stderr": 0.03485946096475741
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258972,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258972
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.20512820512820512,
"acc_stderr": 0.02645350805404035,
"acc_norm": 0.20512820512820512,
"acc_norm_stderr": 0.02645350805404035
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.20434227330779056,
"acc_stderr": 0.0144191239809319,
"acc_norm": 0.20434227330779056,
"acc_norm_stderr": 0.0144191239809319
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.021855255263421802,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.021855255263421802
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.19614147909967847,
"acc_stderr": 0.022552447780478022,
"acc_norm": 0.19614147909967847,
"acc_norm_stderr": 0.022552447780478022
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.023246202647819746,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.023246202647819746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.026684564340460987,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.026684564340460987
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2516297262059974,
"acc_stderr": 0.011083276280441907,
"acc_norm": 0.2516297262059974,
"acc_norm_stderr": 0.011083276280441907
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.030161911930767102,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.030161911930767102
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.017740899509177788,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.017740899509177788
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2,
"acc_stderr": 0.03831305140884603,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03831305140884603
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.4,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22388059701492538,
"acc_stderr": 0.029475250236017193,
"acc_norm": 0.22388059701492538,
"acc_norm_stderr": 0.029475250236017193
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-virology|5": {
"acc": 0.1927710843373494,
"acc_stderr": 0.030709824050565274,
"acc_norm": 0.1927710843373494,
"acc_norm_stderr": 0.030709824050565274
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22399020807833536,
"mc1_stderr": 0.014594964329474203,
"mc2": 0.3895405171510521,
"mc2_stderr": 0.014756538274909399
}
}
```
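As a small illustration (a sketch that assumes `results` holds the dictionary printed above, e.g. parsed from the linked JSON file), the 57 per-task MMLU scores can be summarized like this:

```python
# `results` is assumed to be the dict shown above (hypothetical variable name).
mmlu = {
    task: scores
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
}
mean_acc = sum(s["acc"] for s in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subsets, mean acc = {mean_acc:.4f}")
```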
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_lgaalves__llama-2-13b-chat-platypus | 2023-09-12T04:56:15.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of lgaalves/llama-2-13b-chat-platypus
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lgaalves/llama-2-13b-chat-platypus](https://huggingface.co/lgaalves/llama-2-13b-chat-platypus)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lgaalves__llama-2-13b-chat-platypus\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T04:54:55.763898](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__llama-2-13b-chat-platypus/blob/main/results_2023-09-12T04-54-55.763898.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5445505675467086,\n\
\ \"acc_stderr\": 0.03448981086978501,\n \"acc_norm\": 0.548791314862313,\n\
\ \"acc_norm_stderr\": 0.03447305533172637,\n \"mc1\": 0.3108935128518972,\n\
\ \"mc1_stderr\": 0.016203316673559696,\n \"mc2\": 0.46229050042733816,\n\
\ \"mc2_stderr\": 0.014768134860028896\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49402730375426623,\n \"acc_stderr\": 0.014610348300255795,\n\
\ \"acc_norm\": 0.53839590443686,\n \"acc_norm_stderr\": 0.014568245550296356\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6008763194582752,\n\
\ \"acc_stderr\": 0.004887174080003032,\n \"acc_norm\": 0.8067118103963354,\n\
\ \"acc_norm_stderr\": 0.003940700084503099\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249033,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249033\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5584905660377358,\n \"acc_stderr\": 0.030561590426731833,\n\
\ \"acc_norm\": 0.5584905660377358,\n \"acc_norm_stderr\": 0.030561590426731833\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5138888888888888,\n\
\ \"acc_stderr\": 0.04179596617581,\n \"acc_norm\": 0.5138888888888888,\n\
\ \"acc_norm_stderr\": 0.04179596617581\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n\
\ \"acc_stderr\": 0.03778621079092056,\n \"acc_norm\": 0.43352601156069365,\n\
\ \"acc_norm_stderr\": 0.03778621079092056\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4085106382978723,\n \"acc_stderr\": 0.03213418026701576,\n\
\ \"acc_norm\": 0.4085106382978723,\n \"acc_norm_stderr\": 0.03213418026701576\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36507936507936506,\n \"acc_stderr\": 0.024796060602699947,\n \"\
acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.024796060602699947\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.042857142857142816,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.042857142857142816\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6290322580645161,\n\
\ \"acc_stderr\": 0.02748054188795359,\n \"acc_norm\": 0.6290322580645161,\n\
\ \"acc_norm_stderr\": 0.02748054188795359\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3497536945812808,\n \"acc_stderr\": 0.03355400904969566,\n\
\ \"acc_norm\": 0.3497536945812808,\n \"acc_norm_stderr\": 0.03355400904969566\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6767676767676768,\n \"acc_stderr\": 0.03332299921070644,\n \"\
acc_norm\": 0.6767676767676768,\n \"acc_norm_stderr\": 0.03332299921070644\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.028112091210117474,\n\
\ \"acc_norm\": 0.8134715025906736,\n \"acc_norm_stderr\": 0.028112091210117474\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5128205128205128,\n \"acc_stderr\": 0.025342671293807257,\n\
\ \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.025342671293807257\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5336134453781513,\n \"acc_stderr\": 0.03240501447690071,\n \
\ \"acc_norm\": 0.5336134453781513,\n \"acc_norm_stderr\": 0.03240501447690071\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719198,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719198\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7247706422018348,\n \"acc_stderr\": 0.019149093743155203,\n \"\
acc_norm\": 0.7247706422018348,\n \"acc_norm_stderr\": 0.019149093743155203\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7058823529411765,\n \"acc_stderr\": 0.03198001660115071,\n \"\
acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.03198001660115071\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842555,\n \
\ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842555\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.04243869242230524,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.04243869242230524\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.038498560987940876,\n \"\
acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.038498560987940876\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04557239513497752,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04557239513497752\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.015671006009339586,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.015671006009339586\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5838150289017341,\n \"acc_stderr\": 0.02653818910470548,\n\
\ \"acc_norm\": 0.5838150289017341,\n \"acc_norm_stderr\": 0.02653818910470548\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3195530726256983,\n\
\ \"acc_stderr\": 0.0155955202941474,\n \"acc_norm\": 0.3195530726256983,\n\
\ \"acc_norm_stderr\": 0.0155955202941474\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510468008,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510468008\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6045016077170418,\n\
\ \"acc_stderr\": 0.027770918531427838,\n \"acc_norm\": 0.6045016077170418,\n\
\ \"acc_norm_stderr\": 0.027770918531427838\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.027431623722415012,\n\
\ \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.027431623722415012\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4078014184397163,\n \"acc_stderr\": 0.02931601177634356,\n \
\ \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.02931601177634356\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38070404172099087,\n\
\ \"acc_stderr\": 0.012401430654645888,\n \"acc_norm\": 0.38070404172099087,\n\
\ \"acc_norm_stderr\": 0.012401430654645888\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5441176470588235,\n \"acc_stderr\": 0.030254372573976715,\n\
\ \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.030254372573976715\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5375816993464052,\n \"acc_stderr\": 0.020170614974969758,\n \
\ \"acc_norm\": 0.5375816993464052,\n \"acc_norm_stderr\": 0.020170614974969758\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726496,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726496\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\
\ \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n\
\ \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"\
acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\"\
: 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\":\
\ {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3108935128518972,\n\
\ \"mc1_stderr\": 0.016203316673559696,\n \"mc2\": 0.46229050042733816,\n\
\ \"mc2_stderr\": 0.014768134860028896\n }\n}\n```"
repo_url: https://huggingface.co/lgaalves/llama-2-13b-chat-platypus
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|arc:challenge|25_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hellaswag|10_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T04-54-55.763898.parquet'
- config_name: results
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- results_2023-09-12T04-54-55.763898.parquet
- split: latest
path:
- results_2023-09-12T04-54-55.763898.parquet
---
# Dataset Card for Evaluation run of lgaalves/llama-2-13b-chat-platypus
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lgaalves/llama-2-13b-chat-platypus
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lgaalves/llama-2-13b-chat-platypus](https://huggingface.co/lgaalves/llama-2-13b-chat-platypus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lgaalves__llama-2-13b-chat-platypus",
"harness_truthfulqa_mc_0",
split="train")
```
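The same pattern works for any configuration declared in the YAML header above. As a minimal sketch (assuming the split names declared there, where `latest` tracks the newest run), you can also pull the aggregated `results` configuration directly:
```python
from datasets import load_dataset

# Load the aggregated metrics table; the "latest" split tracks the newest run.
results = load_dataset("open-llm-leaderboard/details_lgaalves__llama-2-13b-chat-platypus",
                       "results",
                       split="latest")

# Each row flattens the results JSON of one run; inspect the first row.
print(results[0])
```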
## Latest results
These are the [latest results from run 2023-09-12T04:54:55.763898](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__llama-2-13b-chat-platypus/blob/main/results_2023-09-12T04-54-55.763898.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5445505675467086,
"acc_stderr": 0.03448981086978501,
"acc_norm": 0.548791314862313,
"acc_norm_stderr": 0.03447305533172637,
"mc1": 0.3108935128518972,
"mc1_stderr": 0.016203316673559696,
"mc2": 0.46229050042733816,
"mc2_stderr": 0.014768134860028896
},
"harness|arc:challenge|25": {
"acc": 0.49402730375426623,
"acc_stderr": 0.014610348300255795,
"acc_norm": 0.53839590443686,
"acc_norm_stderr": 0.014568245550296356
},
"harness|hellaswag|10": {
"acc": 0.6008763194582752,
"acc_stderr": 0.004887174080003032,
"acc_norm": 0.8067118103963354,
"acc_norm_stderr": 0.003940700084503099
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249033,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249033
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5584905660377358,
"acc_stderr": 0.030561590426731833,
"acc_norm": 0.5584905660377358,
"acc_norm_stderr": 0.030561590426731833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.04179596617581,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.04179596617581
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.03778621079092056,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.03778621079092056
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4085106382978723,
"acc_stderr": 0.03213418026701576,
"acc_norm": 0.4085106382978723,
"acc_norm_stderr": 0.03213418026701576
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.024796060602699947,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.024796060602699947
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.042857142857142816,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.042857142857142816
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6290322580645161,
"acc_stderr": 0.02748054188795359,
"acc_norm": 0.6290322580645161,
"acc_norm_stderr": 0.02748054188795359
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3497536945812808,
"acc_stderr": 0.03355400904969566,
"acc_norm": 0.3497536945812808,
"acc_norm_stderr": 0.03355400904969566
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6767676767676768,
"acc_stderr": 0.03332299921070644,
"acc_norm": 0.6767676767676768,
"acc_norm_stderr": 0.03332299921070644
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8134715025906736,
"acc_stderr": 0.028112091210117474,
"acc_norm": 0.8134715025906736,
"acc_norm_stderr": 0.028112091210117474
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5128205128205128,
"acc_stderr": 0.025342671293807257,
"acc_norm": 0.5128205128205128,
"acc_norm_stderr": 0.025342671293807257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5336134453781513,
"acc_stderr": 0.03240501447690071,
"acc_norm": 0.5336134453781513,
"acc_norm_stderr": 0.03240501447690071
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719198,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719198
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7247706422018348,
"acc_stderr": 0.019149093743155203,
"acc_norm": 0.7247706422018348,
"acc_norm_stderr": 0.019149093743155203
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.03198001660115071,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.03198001660115071
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.029178682304842555,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.029178682304842555
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.04243869242230524,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.04243869242230524
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.038498560987940876,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.038498560987940876
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497752,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497752
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.015671006009339586,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.015671006009339586
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.02653818910470548,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.02653818910470548
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3195530726256983,
"acc_stderr": 0.0155955202941474,
"acc_norm": 0.3195530726256983,
"acc_norm_stderr": 0.0155955202941474
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027914055510468008,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027914055510468008
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6045016077170418,
"acc_stderr": 0.027770918531427838,
"acc_norm": 0.6045016077170418,
"acc_norm_stderr": 0.027770918531427838
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.027431623722415012,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.027431623722415012
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4078014184397163,
"acc_stderr": 0.02931601177634356,
"acc_norm": 0.4078014184397163,
"acc_norm_stderr": 0.02931601177634356
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.38070404172099087,
"acc_stderr": 0.012401430654645888,
"acc_norm": 0.38070404172099087,
"acc_norm_stderr": 0.012401430654645888
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.030254372573976715,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.030254372573976715
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5375816993464052,
"acc_stderr": 0.020170614974969758,
"acc_norm": 0.5375816993464052,
"acc_norm_stderr": 0.020170614974969758
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726496,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.03134328358208954,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.03134328358208954
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3108935128518972,
"mc1_stderr": 0.016203316673559696,
"mc2": 0.46229050042733816,
"mc2_stderr": 0.014768134860028896
}
}
```
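As a quick illustration of how this nested structure can be post-processed, here is a minimal sketch (assuming the JSON block above has been saved locally as `results.json`) that averages the normalized accuracy over the `hendrycksTest` (MMLU) subtasks:
```python
import json

# Parse the results dict shown above, saved locally as results.json.
with open("results.json", encoding="utf-8") as f:
    results = json.load(f)

# Collect acc_norm over every MMLU (hendrycksTest) subtask and average it.
mmlu = [m["acc_norm"] for task, m in results.items()
        if task.startswith("harness|hendrycksTest-")]
print(f"MMLU acc_norm over {len(mmlu)} subtasks: {sum(mmlu) / len(mmlu):.4f}")
```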
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
SlookUP/ChatLawAll | 2023-09-12T05:37:48.000Z | [
"license:openrail",
"region:us"
] | SlookUP | null | null | null | 0 | 0 | ---
license: openrail
---
|
botp/LinkSoul-instruction_merge_set | 2023-09-12T05:39:45.000Z | [
"region:us"
] | botp | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 13444870155
num_examples: 10077297
download_size: 3542585235
dataset_size: 13444870155
duplicated_from: LinkSoul/instruction_merge_set
---
# Dataset Card for "instruction_merge_set"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yjching/sas-documentation-v1 | 2023-09-12T06:05:11.000Z | [
"region:us"
] | yjching | null | null | null | 0 | 0 | Entry not found |
bupt/LawDataset-BUPT | 2023-09-14T06:55:15.000Z | [
"size_categories:1M<n<10M",
"language:zh",
"legal",
"region:us"
] | bupt | null | null | null | 7 | 0 | ---
language:
- zh
tags:
- legal
pretty_name: LawDataset-BUPT
size_categories:
- 1M<n<10M
---
## LawDataset-BUPT ⚖️
Here is the full data from the Legal LLM project, from which we hope to build a high-quality dataset.
Here's our [GitHub project page](https://github.com/KLGR123/LegalLLM-BUPT).
If you want to contribute, please contact me on QQ: 2248157602.
### Data Sources
Our data mainly comes from:
- CrimeKgAssistant, 856 crime KG items / 2800k crime name_entities / 200k lawQA with 13 classes
- Tigerbot-law-plugin 55k laws provision data with 11 classes
- Wenshu_ms_dataset 45k law judgements data
- Lexilaw
- LawGPT-zh 52k QA data
- Lawyer_LLAMA law exam and instruction data
- hualv_webste_QA 20k law QA data
- baidu_zhidao_law_QA 36k law QA data
- BELLE general dataset 1.5M
For BELLE dataset and models, please download directly at [BELLE huggingface page](https://huggingface.co/datasets/BELLE-2/train_3.5M_CN_With_Category).
### Data Statistics
So far, the dataset sizes are approximately:
- Law QA data size: ~310k
- Law provision data size: ~55k
- Law judgement data size: ~45k
- General data size: ~1500k
### Data Fields
You can check the data fields for each data source below.
Wenshu_ms_dataset 45k law judgements data
```
{
"Case": "王某甲与辽宁古田房地产有限公司房屋拆迁安置补偿合同纠纷一审民事判决书",
"CaseId": "7abb676880254ca79c34a90e0101bc8e",
"CaseProc": "民事一审",
"CaseRecord": "原告王某甲与被告辽宁古田房地产有限公司房屋拆迁安置补偿合同纠纷一案,本院于2018年4月26日受理后,依法由审判员雷凯独任审判,公开开庭进行了审理。原告王某甲与被告辽宁古田房地产有限公司的委托代理人李某、刘某某到庭参加诉讼。本案现已审理终结",
"CaseType": "民事案件",
"JudgeAccusation": "原告王某甲诉称:原告原住大东区XX,2009年动迁至2014年回迁,至今被告没给原告房屋补助款。原告多次向被告主张房屋补助款,被告总是说没钱等等再等等。后来被告用这笔款给原告折抵五年物业费(从2015.1.1至2019.12.31),剩余房屋补助费3万多,到现在一直没解决,故起诉至法院。请求法院判令1、被告给付原告房屋拆迁款48000元;2、起诉费由被告承担。\n被告辽宁古田房地产有限公司辩称:针对原告诉讼请求48000元,被告对此不予认可,原、被告双方于2016年9月21日签订了协议书一份,对双方拆迁安置补助费的具体数额进行了重新确认,顶5年物业费后,尚欠安置费33828元。现原告诉讼请求48000元无法律依据,应按双方签订的协议书继续履行,该协议书系双方真实意思表示,具有法律效力。\n经审理查明:2008年7月25日,原被告签订城市房屋拆迁补偿安置协议。2016年9月21日,原告与被告签订协议书,该协议约定逾期安置补助费为48000元,原被告双方同意按百分之八十即38400元进行全部抵顶。其中4572元抵顶原告房屋五年的物业费(从2015年1月1日至2019年12月31日期间),剩余33828元待被告资金充足时解决。原告在庭审中自述从2015年至今没有缴纳过物业费。\n上述事实,有城市房屋拆迁补偿安置协议、协议书等证据及原被告陈述,经开庭质证,本院予以确认,在卷佐证",
"JudgeReason": "本院认为:2016年9月21日,原告与被告签订协议书系双方真实的意思表示,内容不违反法律规定,合法有效,双方均应遵守。在该协议中,原被告协商一致在抵顶五年的物业费后,被告尚欠原告逾期安置补助费33828元,被告至今没有给付原告,故被告应当给付原告逾期安置补助费33828元。\n综上所述,根据《中华人民共和国合同法》第四十四条之规定,判决如下",
"JudgeResult": "一、被告辽宁古田房地产有限公司于本判决生效后十日内给付原告王某甲逾期安置补助费33828元;\n二、驳回原告王某甲的其他诉讼请求。\n如被告未按本判决所指定的期限履行给付义务,则应当依照《中华人民共和国民事诉讼法》第二百五十三条之规定,加倍支付迟延履行期间的债务利息。\n案件受理费1000元,减半收取500元,由原告王某甲负担177元,由被告辽宁古田房地产有限公司负担323元。\n如不服本判决,可在判决书送达之日起15日内向本院递交上诉状,并按对方当事人的人数提出副本,交纳上诉案件受理费,上诉于辽宁省沈阳市中级人民法院。如上诉期满后7日内未交纳上诉案件受理费,按自动撤回上诉处理",
"Keywords": [
"给付"
],
"Parties": [
{
"NameText": "王某甲",
"Name": "王某甲",
"LegalEntity": "Person",
"Prop": "原告"
},
{
"NameText": "辽宁古田房地产有限公司",
"Name": "辽宁古田房地产有限公司",
"LegalEntity": "Corporation",
"Prop": "被告"
}
],
"Category": {
"cat_1": "房地产纠纷",
"cat_2": "房产纠纷"
}
}
```
Tigerbot-law-plugin 55k laws provision data with 11 classes
```
{"type": "宪法", "title": "中华人民共和国宪法", "chapter1": "第一章 总纲", "content": "第六条 中华人民共和国的社会主义经济制度的基础是生产资料的社会主义公有制,即全民所有制和劳动群众集体所有制。社会主义公有制消灭人剥削人的制度,实行各尽所能、按劳分配的原则。\n国家在社会主义初级阶段,坚持公有制为主体、多种所有制经济共同发展的基本经济制度,坚持按劳分配为主体、多种分配方式并存的分配制度。", "chapter2": "", "chapter3": ""}
```
baidu_zhidao_law_QA 36k law QA data
```
title,question,reply,is_best
在法律中定金与订金的区别订金和定金哪个受,,“定金”是指当事人约定由一方向对方给付的,作为债权担保的一定数额的货币,它属于一种法律上的担保方式,目的在于促使债务人履行债务,保障债权人的债权得以实现。签合同时,对定金必需以书面形式进行约定,同时还应约定定金的数额和交付期限。给付定金一方如果不履行债务,无权要求另一方返还定金;接受定金的一方如果不履行债务,需向另一方双倍返还债务。债务人履行债务后,依照约定,定金应抵作价款或者收回。而“订金”目前我国法律没有明确规定,它不具备定金所具有的担保性质,可视为“预付款”,当合同不能履行时,除不可抗力外,应根据双方当事人的过错承担违约责任。,1
```
CrimeKgAssistant
```
{
"completion": "根据相关法律规定,未满14岁的人不得驾驶机动车辆,骑行电动车也属于机动车范畴。因此,未成年捏了电动车的刹车,可以认定为违法行为。同时,成年骑电动车的人也应当承担相应的民事责任。",
"prompt": "一14岁未成年骑自行车与一成年骑电动车相撞,T形路,未成年拐弯,未成年捏了刹车的电动车属于机动车吗??"
}
```
JEC-QA
```
{"answer": ["D"], "id": "3_2613", "option_list": {"A": "因未办理收养登记,包某与陈煜之间不存在法律上父子关系", "B": "陈煜作为包某生前抚养且无经济来源的人,可适当分得包某遗产", "C": "陈某的遗产由洪某与陈婴继承,陈煜不能继承", "D": "陈煜既可以继承陈某的遗产,也可以继承包某的遗产"}, "statement": "陈某与潘某离婚后,潘某带着2岁的儿子陈煜改嫁包某。陈某、潘某、包某三人订立收养协议,陈煜由包某收养,今后一切与陈某概无关系,但未办理收养登记。5年后,潘某与包某生下一女,取名包红。陈某离婚后,与洪某结婚,生女取名陈婴。几年后,陈某、包某相继去世。下列说法中正确的是:", "type": "1"}
```
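Most of the QA-style sources above (CrimeKgAssistant, JEC-QA) are plain JSON records, so loading them is straightforward. Here is a minimal sketch, assuming the records are stored one JSON object per line (the file name is a hypothetical placeholder; check the repository for the actual paths and formats):
```python
import json

# Hypothetical path for illustration only; see the repo for the real file names.
with open("crime_kg_assistant_qa.jsonl", encoding="utf-8") as f:
    for line in f:
        record = json.loads(line)
        # Each record pairs a legal question ("prompt") with an answer ("completion").
        print(record["prompt"][:50], "->", record["completion"][:50])
        break  # only preview the first record
```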
|
runMark/xxxxd | 2023-09-12T06:06:07.000Z | [
"region:us"
] | runMark | null | null | null | 0 | 0 | Entry not found |
mbk0asis/test_data | 2023-09-12T06:13:57.000Z | [
"license:openrail",
"region:us"
] | mbk0asis | null | null | null | 0 | 0 | ---
license: openrail
---
|
DominikLindorfer/SQL-LLaMA | 2023-09-12T06:20:36.000Z | [
"region:us"
] | DominikLindorfer | null | null | null | 0 | 0 | Entry not found |
heangborin/roofGoogleSatelliteKh | 2023-09-12T06:36:52.000Z | [
"license:cc-by-sa-4.0",
"region:us"
] | heangborin | null | null | null | 0 | 0 | ---
license: cc-by-sa-4.0
---
|
open-llm-leaderboard/details_Sao10K__Stheno-1.2-L2-13B | 2023-09-12T06:47:55.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Sao10K/Stheno-1.2-L2-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Stheno-1.2-L2-13B](https://huggingface.co/Sao10K/Stheno-1.2-L2-13B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Stheno-1.2-L2-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T06:46:37.023580](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-1.2-L2-13B/blob/main/results_2023-09-12T06-46-37.023580.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.564241176611762,\n\
\ \"acc_stderr\": 0.03447182405110475,\n \"acc_norm\": 0.568056168282022,\n\
\ \"acc_norm_stderr\": 0.03445031901495072,\n \"mc1\": 0.3525091799265606,\n\
\ \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.5031960610653523,\n\
\ \"mc2_stderr\": 0.0155637939515657\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5725255972696246,\n \"acc_stderr\": 0.014456862944650649,\n\
\ \"acc_norm\": 0.6075085324232082,\n \"acc_norm_stderr\": 0.014269634635670728\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.646584345747859,\n\
\ \"acc_stderr\": 0.004770534055841052,\n \"acc_norm\": 0.8366859191396137,\n\
\ \"acc_norm_stderr\": 0.0036889652317335228\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490436,\n\
\ \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490436\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6188679245283019,\n \"acc_stderr\": 0.02989060968628664,\n\
\ \"acc_norm\": 0.6188679245283019,\n \"acc_norm_stderr\": 0.02989060968628664\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808777,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808777\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.043036840335373146,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.043036840335373146\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3253968253968254,\n \"acc_stderr\": 0.024130158299762606,\n \"\
acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.024130158299762606\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n\
\ \"acc_stderr\": 0.02672949906834996,\n \"acc_norm\": 0.6709677419354839,\n\
\ \"acc_norm_stderr\": 0.02672949906834996\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6919191919191919,\n \"acc_stderr\": 0.032894773300986155,\n \"\
acc_norm\": 0.6919191919191919,\n \"acc_norm_stderr\": 0.032894773300986155\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.025294608023986472,\n\
\ \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.025294608023986472\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606648,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606648\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.032104790510157764,\n\
\ \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.032104790510157764\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7394495412844037,\n \"acc_stderr\": 0.01881918203485007,\n \"\
acc_norm\": 0.7394495412844037,\n \"acc_norm_stderr\": 0.01881918203485007\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"\
acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591362,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460302,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460302\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.041391127276354626,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.041391127276354626\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650741,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650741\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6503067484662577,\n \"acc_stderr\": 0.03746668325470021,\n\
\ \"acc_norm\": 0.6503067484662577,\n \"acc_norm_stderr\": 0.03746668325470021\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326466,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326466\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\
\ \"acc_stderr\": 0.025598193686652244,\n \"acc_norm\": 0.811965811965812,\n\
\ \"acc_norm_stderr\": 0.025598193686652244\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7471264367816092,\n\
\ \"acc_stderr\": 0.015543377313719683,\n \"acc_norm\": 0.7471264367816092,\n\
\ \"acc_norm_stderr\": 0.015543377313719683\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6040462427745664,\n \"acc_stderr\": 0.026329813341946243,\n\
\ \"acc_norm\": 0.6040462427745664,\n \"acc_norm_stderr\": 0.026329813341946243\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.47150837988826816,\n\
\ \"acc_stderr\": 0.016695329746015793,\n \"acc_norm\": 0.47150837988826816,\n\
\ \"acc_norm_stderr\": 0.016695329746015793\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.027780141207023344,\n\
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.027780141207023344\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6045016077170418,\n\
\ \"acc_stderr\": 0.027770918531427838,\n \"acc_norm\": 0.6045016077170418,\n\
\ \"acc_norm_stderr\": 0.027770918531427838\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5895061728395061,\n \"acc_stderr\": 0.027371350925124764,\n\
\ \"acc_norm\": 0.5895061728395061,\n \"acc_norm_stderr\": 0.027371350925124764\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.43617021276595747,\n \"acc_stderr\": 0.029583452036284062,\n \
\ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.029583452036284062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42633637548891784,\n\
\ \"acc_stderr\": 0.012630884771599698,\n \"acc_norm\": 0.42633637548891784,\n\
\ \"acc_norm_stderr\": 0.012630884771599698\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5110294117647058,\n \"acc_stderr\": 0.030365446477275675,\n\
\ \"acc_norm\": 0.5110294117647058,\n \"acc_norm_stderr\": 0.030365446477275675\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5800653594771242,\n \"acc_stderr\": 0.019966811178256483,\n \
\ \"acc_norm\": 0.5800653594771242,\n \"acc_norm_stderr\": 0.019966811178256483\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.030932858792789848,\n\
\ \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.030932858792789848\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.03096590312357303,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.03096590312357303\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3525091799265606,\n\
\ \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.5031960610653523,\n\
\ \"mc2_stderr\": 0.0155637939515657\n }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Stheno-1.2-L2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|arc:challenge|25_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hellaswag|10_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T06-46-37.023580.parquet'
- config_name: results
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- results_2023-09-12T06-46-37.023580.parquet
- split: latest
path:
- results_2023-09-12T06-46-37.023580.parquet
---
# Dataset Card for Evaluation run of Sao10K/Stheno-1.2-L2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/Stheno-1.2-L2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sao10K/Stheno-1.2-L2-13B](https://huggingface.co/Sao10K/Stheno-1.2-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Stheno-1.2-L2-13B",
"harness_truthfulqa_mc_0",
split="train")
```
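Every configuration listed above also exposes a "latest" split, so you can pin your code to the most recent evaluation instead of relying on "train". A minimal sketch, reusing the configuration names from this card:
```python
from datasets import load_dataset

# Most recent evaluation of a single task.
latest = load_dataset("open-llm-leaderboard/details_Sao10K__Stheno-1.2-L2-13B",
                      "harness_truthfulqa_mc_0",
                      split="latest")

# Aggregated per-task scores for the same run, via the "results" configuration.
results = load_dataset("open-llm-leaderboard/details_Sao10K__Stheno-1.2-L2-13B",
                       "results",
                       split="latest")
```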
## Latest results
These are the [latest results from run 2023-09-12T06:46:37.023580](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-1.2-L2-13B/blob/main/results_2023-09-12T06-46-37.023580.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.564241176611762,
"acc_stderr": 0.03447182405110475,
"acc_norm": 0.568056168282022,
"acc_norm_stderr": 0.03445031901495072,
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.5031960610653523,
"mc2_stderr": 0.0155637939515657
},
"harness|arc:challenge|25": {
"acc": 0.5725255972696246,
"acc_stderr": 0.014456862944650649,
"acc_norm": 0.6075085324232082,
"acc_norm_stderr": 0.014269634635670728
},
"harness|hellaswag|10": {
"acc": 0.646584345747859,
"acc_stderr": 0.004770534055841052,
"acc_norm": 0.8366859191396137,
"acc_norm_stderr": 0.0036889652317335228
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.04040311062490436,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.04040311062490436
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6188679245283019,
"acc_stderr": 0.02989060968628664,
"acc_norm": 0.6188679245283019,
"acc_norm_stderr": 0.02989060968628664
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808777,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.043036840335373146,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.043036840335373146
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.024130158299762606,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.024130158299762606
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6709677419354839,
"acc_stderr": 0.02672949906834996,
"acc_norm": 0.6709677419354839,
"acc_norm_stderr": 0.02672949906834996
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6919191919191919,
"acc_stderr": 0.032894773300986155,
"acc_norm": 0.6919191919191919,
"acc_norm_stderr": 0.032894773300986155
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.025294608023986472,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.025294608023986472
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606648,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606648
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5756302521008403,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.5756302521008403,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7394495412844037,
"acc_stderr": 0.01881918203485007,
"acc_norm": 0.7394495412844037,
"acc_norm_stderr": 0.01881918203485007
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460302,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460302
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.041391127276354626,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.041391127276354626
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650741,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650741
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6503067484662577,
"acc_stderr": 0.03746668325470021,
"acc_norm": 0.6503067484662577,
"acc_norm_stderr": 0.03746668325470021
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326466,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326466
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.025598193686652244,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.025598193686652244
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7471264367816092,
"acc_stderr": 0.015543377313719683,
"acc_norm": 0.7471264367816092,
"acc_norm_stderr": 0.015543377313719683
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6040462427745664,
"acc_stderr": 0.026329813341946243,
"acc_norm": 0.6040462427745664,
"acc_norm_stderr": 0.026329813341946243
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.47150837988826816,
"acc_stderr": 0.016695329746015793,
"acc_norm": 0.47150837988826816,
"acc_norm_stderr": 0.016695329746015793
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.027780141207023344,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.027780141207023344
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6045016077170418,
"acc_stderr": 0.027770918531427838,
"acc_norm": 0.6045016077170418,
"acc_norm_stderr": 0.027770918531427838
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5895061728395061,
"acc_stderr": 0.027371350925124764,
"acc_norm": 0.5895061728395061,
"acc_norm_stderr": 0.027371350925124764
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.029583452036284062,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.029583452036284062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42633637548891784,
"acc_stderr": 0.012630884771599698,
"acc_norm": 0.42633637548891784,
"acc_norm_stderr": 0.012630884771599698
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5110294117647058,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.5110294117647058,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5800653594771242,
"acc_stderr": 0.019966811178256483,
"acc_norm": 0.5800653594771242,
"acc_norm_stderr": 0.019966811178256483
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6285714285714286,
"acc_stderr": 0.030932858792789848,
"acc_norm": 0.6285714285714286,
"acc_norm_stderr": 0.030932858792789848
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.03096590312357303,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.03096590312357303
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.5031960610653523,
"mc2_stderr": 0.0155637939515657
}
}
```
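To pull individual metrics out of this structure, a plain dictionary comprehension is enough. A minimal sketch, assuming `results` holds the dictionary shown above (truncated here to two tasks for brevity):
```python
# Assumed to be the dictionary shown above, truncated for brevity.
results = {
    "harness|hendrycksTest-us_foreign_policy|5": {"acc": 0.82, "acc_norm": 0.82},
    "harness|hendrycksTest-virology|5": {"acc": 0.46987951807228917,
                                         "acc_norm": 0.46987951807228917},
}

# Map each MMLU task name to its normalized accuracy.
mmlu_scores = {
    task.split("|")[1].removeprefix("hendrycksTest-"): metrics["acc_norm"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
}

best = max(mmlu_scores, key=mmlu_scores.get)
print(best, mmlu_scores[best])  # -> us_foreign_policy 0.82
```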
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
michelleyunun/animal_control_register | 2023-09-12T06:47:35.000Z | [
"region:us"
] | michelleyunun | null | null | null | 0 | 0 | Entry not found |
yzhuang/autotree_automl_100000_Diabetes130US_sgosdt_l256_dim7_d3_sd0 | 2023-09-12T06:54:10.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: input_y_clean
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 2057200000
num_examples: 100000
- name: validation
num_bytes: 205720000
num_examples: 10000
download_size: 257403365
dataset_size: 2262920000
---
# Dataset Card for "autotree_automl_100000_Diabetes130US_sgosdt_l256_dim7_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SichaoHu/small_dataset_for_testing | 2023-09-12T07:08:36.000Z | [
"license:apache-2.0",
"region:us"
] | SichaoHu | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
Falah/marble_prompts | 2023-09-12T07:45:44.000Z | [
"region:us"
] | Falah | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 3029313
num_examples: 10000
download_size: 380891
dataset_size: 3029313
---
# Dataset Card for "marble_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
marasama/nva-Northwemko | 2023-09-12T07:12:27.000Z | [
"region:us"
] | marasama | null | null | null | 0 | 0 | Entry not found |
Lekhanakraj/valorant | 2023-09-13T16:36:27.000Z | [
"region:us"
] | Lekhanakraj | null | null | null | 0 | 0 | Entry not found |
DmitrMakeev/ssk-tunel | 2023-09-13T11:05:40.000Z | [
"license:openrail",
"region:us"
] | DmitrMakeev | null | null | null | 0 | 0 | ---
license: openrail
---
|
open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096 | 2023-09-12T08:26:20.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of KnutJaegersberg/RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KnutJaegersberg/RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096](https://huggingface.co/KnutJaegersberg/RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T08:25:08.540326](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096/blob/main/results_2023-09-12T08-25-08.540326.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2535285149165662,\n\
\ \"acc_stderr\": 0.03140221446208649,\n \"acc_norm\": 0.25600417602807973,\n\
\ \"acc_norm_stderr\": 0.031408658638698454,\n \"mc1\": 0.21297429620563035,\n\
\ \"mc1_stderr\": 0.014332203787059686,\n \"mc2\": 0.349588061414408,\n\
\ \"mc2_stderr\": 0.013546330859396825\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2841296928327645,\n \"acc_stderr\": 0.013179442447653887,\n\
\ \"acc_norm\": 0.30631399317406144,\n \"acc_norm_stderr\": 0.01347058441727651\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.40240987851025695,\n\
\ \"acc_stderr\": 0.004893814890208313,\n \"acc_norm\": 0.5262895837482573,\n\
\ \"acc_norm_stderr\": 0.00498287934069141\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3111111111111111,\n\
\ \"acc_stderr\": 0.039992628766177214,\n \"acc_norm\": 0.3111111111111111,\n\
\ \"acc_norm_stderr\": 0.039992628766177214\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03523807393012047,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03523807393012047\n \
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.19,\n\
\ \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.19,\n \
\ \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.24150943396226415,\n \"acc_stderr\": 0.02634148037111835,\n\
\ \"acc_norm\": 0.24150943396226415,\n \"acc_norm_stderr\": 0.02634148037111835\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2916666666666667,\n\
\ \"acc_stderr\": 0.03800968060554858,\n \"acc_norm\": 0.2916666666666667,\n\
\ \"acc_norm_stderr\": 0.03800968060554858\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483099,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483099\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.038739587141493524,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.038739587141493524\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2425531914893617,\n \"acc_stderr\": 0.02802022627120022,\n\
\ \"acc_norm\": 0.2425531914893617,\n \"acc_norm_stderr\": 0.02802022627120022\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.17543859649122806,\n\
\ \"acc_stderr\": 0.03577954813948368,\n \"acc_norm\": 0.17543859649122806,\n\
\ \"acc_norm_stderr\": 0.03577954813948368\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.03600105692727771,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.03600105692727771\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577656,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577656\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.035122074123020534,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.035122074123020534\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23548387096774193,\n\
\ \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.23548387096774193,\n\
\ \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114482,\n\
\ \"acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114482\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2787878787878788,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.2787878787878788,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.16666666666666666,\n \"acc_stderr\": 0.026552207828215282,\n \"\
acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.026552207828215282\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.26424870466321243,\n \"acc_stderr\": 0.03182155050916648,\n\
\ \"acc_norm\": 0.26424870466321243,\n \"acc_norm_stderr\": 0.03182155050916648\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.21794871794871795,\n \"acc_stderr\": 0.020932445774463175,\n\
\ \"acc_norm\": 0.21794871794871795,\n \"acc_norm_stderr\": 0.020932445774463175\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145668,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145668\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.026841514322958927,\n\
\ \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.026841514322958927\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2185430463576159,\n \"acc_stderr\": 0.033742355504256936,\n \"\
acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.033742355504256936\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.26238532110091745,\n \"acc_stderr\": 0.018861885021534724,\n \"\
acc_norm\": 0.26238532110091745,\n \"acc_norm_stderr\": 0.018861885021534724\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.03293377139415191,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.03293377139415191\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604236,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604236\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2911392405063291,\n \"acc_stderr\": 0.029571601065753378,\n \
\ \"acc_norm\": 0.2911392405063291,\n \"acc_norm_stderr\": 0.029571601065753378\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2645739910313901,\n\
\ \"acc_stderr\": 0.02960510321703832,\n \"acc_norm\": 0.2645739910313901,\n\
\ \"acc_norm_stderr\": 0.02960510321703832\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.1984732824427481,\n \"acc_stderr\": 0.03498149385462472,\n\
\ \"acc_norm\": 0.1984732824427481,\n \"acc_norm_stderr\": 0.03498149385462472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.30578512396694213,\n \"acc_stderr\": 0.042059539338841226,\n \"\
acc_norm\": 0.30578512396694213,\n \"acc_norm_stderr\": 0.042059539338841226\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.27607361963190186,\n \"acc_stderr\": 0.0351238528370505,\n\
\ \"acc_norm\": 0.27607361963190186,\n \"acc_norm_stderr\": 0.0351238528370505\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340455,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340455\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3034188034188034,\n\
\ \"acc_stderr\": 0.030118210106942638,\n \"acc_norm\": 0.3034188034188034,\n\
\ \"acc_norm_stderr\": 0.030118210106942638\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2413793103448276,\n\
\ \"acc_stderr\": 0.015302380123542075,\n \"acc_norm\": 0.2413793103448276,\n\
\ \"acc_norm_stderr\": 0.015302380123542075\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2514450867052023,\n \"acc_stderr\": 0.023357365785874044,\n\
\ \"acc_norm\": 0.2514450867052023,\n \"acc_norm_stderr\": 0.023357365785874044\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574898,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574898\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21895424836601307,\n \"acc_stderr\": 0.02367908986180772,\n\
\ \"acc_norm\": 0.21895424836601307,\n \"acc_norm_stderr\": 0.02367908986180772\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26366559485530544,\n\
\ \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.26366559485530544,\n\
\ \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24691358024691357,\n \"acc_stderr\": 0.02399350170904212,\n\
\ \"acc_norm\": 0.24691358024691357,\n \"acc_norm_stderr\": 0.02399350170904212\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.26595744680851063,\n \"acc_stderr\": 0.02635806569888059,\n \
\ \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.02635806569888059\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26597131681877445,\n\
\ \"acc_stderr\": 0.011285033165551265,\n \"acc_norm\": 0.26597131681877445,\n\
\ \"acc_norm_stderr\": 0.011285033165551265\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.023157468308559373,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.023157468308559373\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.28594771241830064,\n \"acc_stderr\": 0.018280485072954683,\n \
\ \"acc_norm\": 0.28594771241830064,\n \"acc_norm_stderr\": 0.018280485072954683\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2163265306122449,\n \"acc_stderr\": 0.026358916334904028,\n\
\ \"acc_norm\": 0.2163265306122449,\n \"acc_norm_stderr\": 0.026358916334904028\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n\
\ \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.23383084577114427,\n\
\ \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3313253012048193,\n\
\ \"acc_stderr\": 0.03664314777288086,\n \"acc_norm\": 0.3313253012048193,\n\
\ \"acc_norm_stderr\": 0.03664314777288086\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21297429620563035,\n\
\ \"mc1_stderr\": 0.014332203787059686,\n \"mc2\": 0.349588061414408,\n\
\ \"mc2_stderr\": 0.013546330859396825\n }\n}\n```"
repo_url: https://huggingface.co/KnutJaegersberg/RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|arc:challenge|25_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hellaswag|10_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T08-25-08.540326.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T08-25-08.540326.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T08-25-08.540326.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T08-25-08.540326.parquet'
- config_name: results
data_files:
- split: 2023_09_12T08_25_08.540326
path:
- results_2023-09-12T08-25-08.540326.parquet
- split: latest
path:
- results_2023-09-12T08-25-08.540326.parquet
---
# Dataset Card for Evaluation run of KnutJaegersberg/RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KnutJaegersberg/RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KnutJaegersberg/RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096](https://huggingface.co/KnutJaegersberg/RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096",
"harness_truthfulqa_mc_0",
split="train")
```
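Any of the per-task configurations listed in the YAML header above can be loaded the same way. As a minimal sketch (any `harness_hendrycksTest_*_5` config name from the header works identically), you can also target the "latest" split instead of "train":
```python
from datasets import load_dataset

# Per-sample details for one MMLU subtask; the "latest" split is an alias
# that resolves to the parquet file of the most recent run.
details = load_dataset(
    "open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
```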
## Latest results
These are the [latest results from run 2023-09-12T08:25:08.540326](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096/blob/main/results_2023-09-12T08-25-08.540326.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2535285149165662,
"acc_stderr": 0.03140221446208649,
"acc_norm": 0.25600417602807973,
"acc_norm_stderr": 0.031408658638698454,
"mc1": 0.21297429620563035,
"mc1_stderr": 0.014332203787059686,
"mc2": 0.349588061414408,
"mc2_stderr": 0.013546330859396825
},
"harness|arc:challenge|25": {
"acc": 0.2841296928327645,
"acc_stderr": 0.013179442447653887,
"acc_norm": 0.30631399317406144,
"acc_norm_stderr": 0.01347058441727651
},
"harness|hellaswag|10": {
"acc": 0.40240987851025695,
"acc_stderr": 0.004893814890208313,
"acc_norm": 0.5262895837482573,
"acc_norm_stderr": 0.00498287934069141
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.039992628766177214,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.039992628766177214
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.25,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24150943396226415,
"acc_stderr": 0.02634148037111835,
"acc_norm": 0.24150943396226415,
"acc_norm_stderr": 0.02634148037111835
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.03800968060554858,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.03800968060554858
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483099,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483099
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.038739587141493524,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.038739587141493524
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2425531914893617,
"acc_stderr": 0.02802022627120022,
"acc_norm": 0.2425531914893617,
"acc_norm_stderr": 0.02802022627120022
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.17543859649122806,
"acc_stderr": 0.03577954813948368,
"acc_norm": 0.17543859649122806,
"acc_norm_stderr": 0.03577954813948368
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.03600105692727771,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.03600105692727771
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577656,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.02278967314577656
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.035122074123020534,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.035122074123020534
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23548387096774193,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.23548387096774193,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114482,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114482
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2787878787878788,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.2787878787878788,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.026552207828215282,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.026552207828215282
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.26424870466321243,
"acc_stderr": 0.03182155050916648,
"acc_norm": 0.26424870466321243,
"acc_norm_stderr": 0.03182155050916648
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.21794871794871795,
"acc_stderr": 0.020932445774463175,
"acc_norm": 0.21794871794871795,
"acc_norm_stderr": 0.020932445774463175
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145668,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145668
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2184873949579832,
"acc_stderr": 0.026841514322958927,
"acc_norm": 0.2184873949579832,
"acc_norm_stderr": 0.026841514322958927
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.033742355504256936,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.033742355504256936
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26238532110091745,
"acc_stderr": 0.018861885021534724,
"acc_norm": 0.26238532110091745,
"acc_norm_stderr": 0.018861885021534724
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.03293377139415191,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.03293377139415191
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604236,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604236
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2911392405063291,
"acc_stderr": 0.029571601065753378,
"acc_norm": 0.2911392405063291,
"acc_norm_stderr": 0.029571601065753378
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2645739910313901,
"acc_stderr": 0.02960510321703832,
"acc_norm": 0.2645739910313901,
"acc_norm_stderr": 0.02960510321703832
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.1984732824427481,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.1984732824427481,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.30578512396694213,
"acc_stderr": 0.042059539338841226,
"acc_norm": 0.30578512396694213,
"acc_norm_stderr": 0.042059539338841226
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.27607361963190186,
"acc_stderr": 0.0351238528370505,
"acc_norm": 0.27607361963190186,
"acc_norm_stderr": 0.0351238528370505
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340455,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340455
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3034188034188034,
"acc_stderr": 0.030118210106942638,
"acc_norm": 0.3034188034188034,
"acc_norm_stderr": 0.030118210106942638
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.015302380123542075,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.015302380123542075
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2514450867052023,
"acc_stderr": 0.023357365785874044,
"acc_norm": 0.2514450867052023,
"acc_norm_stderr": 0.023357365785874044
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574898,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574898
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21895424836601307,
"acc_stderr": 0.02367908986180772,
"acc_norm": 0.21895424836601307,
"acc_norm_stderr": 0.02367908986180772
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26366559485530544,
"acc_stderr": 0.02502553850053234,
"acc_norm": 0.26366559485530544,
"acc_norm_stderr": 0.02502553850053234
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24691358024691357,
"acc_stderr": 0.02399350170904212,
"acc_norm": 0.24691358024691357,
"acc_norm_stderr": 0.02399350170904212
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.02635806569888059,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.02635806569888059
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26597131681877445,
"acc_stderr": 0.011285033165551265,
"acc_norm": 0.26597131681877445,
"acc_norm_stderr": 0.011285033165551265
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.023157468308559373,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.023157468308559373
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.28594771241830064,
"acc_stderr": 0.018280485072954683,
"acc_norm": 0.28594771241830064,
"acc_norm_stderr": 0.018280485072954683
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2163265306122449,
"acc_stderr": 0.026358916334904028,
"acc_norm": 0.2163265306122449,
"acc_norm_stderr": 0.026358916334904028
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348384,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348384
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3313253012048193,
"acc_stderr": 0.03664314777288086,
"acc_norm": 0.3313253012048193,
"acc_norm_stderr": 0.03664314777288086
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21297429620563035,
"mc1_stderr": 0.014332203787059686,
"mc2": 0.349588061414408,
"mc2_stderr": 0.013546330859396825
}
}
```
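These aggregates are also stored in the "results" configuration declared in the YAML header; a minimal sketch for pulling them directly with the `datasets` library (assuming network access to the Hub):
```python
from datasets import load_dataset

# The "results" config points at the run's results_*.parquet file;
# the "latest" split resolves to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096",
    "results",
    split="latest",
)
```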
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Sao10K__Stheno-Inverted-1.2-L2-13B | 2023-09-12T08:40:28.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Sao10K/Stheno-Inverted-1.2-L2-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Stheno-Inverted-1.2-L2-13B](https://huggingface.co/Sao10K/Stheno-Inverted-1.2-L2-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Stheno-Inverted-1.2-L2-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T08:39:09.830541](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-Inverted-1.2-L2-13B/blob/main/results_2023-09-12T08-39-09.830541.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5593383883718774,\n\
\ \"acc_stderr\": 0.03445294831748893,\n \"acc_norm\": 0.5629656079168563,\n\
\ \"acc_norm_stderr\": 0.03443327926234939,\n \"mc1\": 0.35862913096695226,\n\
\ \"mc1_stderr\": 0.016789289499502022,\n \"mc2\": 0.5121990488422448,\n\
\ \"mc2_stderr\": 0.015675496047240438\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.568259385665529,\n \"acc_stderr\": 0.014474591427196202,\n\
\ \"acc_norm\": 0.5938566552901023,\n \"acc_norm_stderr\": 0.014351656690097862\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6417048396733719,\n\
\ \"acc_stderr\": 0.00478519504988916,\n \"acc_norm\": 0.8301135232025493,\n\
\ \"acc_norm_stderr\": 0.003747655533754522\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411022,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411022\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490436,\n\
\ \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490436\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6264150943396226,\n \"acc_stderr\": 0.029773082713319875,\n\
\ \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.029773082713319875\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4913294797687861,\n\
\ \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.4913294797687861,\n\
\ \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31746031746031744,\n \"acc_stderr\": 0.023973861998992072,\n \"\
acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.023973861998992072\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6580645161290323,\n \"acc_stderr\": 0.02698528957655275,\n \"\
acc_norm\": 0.6580645161290323,\n \"acc_norm_stderr\": 0.02698528957655275\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162934,\n \"\
acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162934\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.03663974994391245,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.03663974994391245\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.028112091210117474,\n\
\ \"acc_norm\": 0.8134715025906736,\n \"acc_norm_stderr\": 0.028112091210117474\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5230769230769231,\n \"acc_stderr\": 0.025323990861736232,\n\
\ \"acc_norm\": 0.5230769230769231,\n \"acc_norm_stderr\": 0.025323990861736232\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815635,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815635\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.542016806722689,\n \"acc_stderr\": 0.032363611119519416,\n \
\ \"acc_norm\": 0.542016806722689,\n \"acc_norm_stderr\": 0.032363611119519416\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7394495412844037,\n \"acc_stderr\": 0.01881918203485007,\n \"\
acc_norm\": 0.7394495412844037,\n \"acc_norm_stderr\": 0.01881918203485007\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.36574074074074076,\n \"acc_stderr\": 0.03284738857647207,\n \"\
acc_norm\": 0.36574074074074076,\n \"acc_norm_stderr\": 0.03284738857647207\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350195,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350195\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460295,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460295\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.03642914578292406,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.03642914578292406\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280042,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280042\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7509578544061303,\n\
\ \"acc_stderr\": 0.015464676163395951,\n \"acc_norm\": 0.7509578544061303,\n\
\ \"acc_norm_stderr\": 0.015464676163395951\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.02622615860512466,\n\
\ \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.02622615860512466\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40670391061452515,\n\
\ \"acc_stderr\": 0.01642881191589887,\n \"acc_norm\": 0.40670391061452515,\n\
\ \"acc_norm_stderr\": 0.01642881191589887\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.027870745278290275,\n\
\ \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.027870745278290275\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n\
\ \"acc_stderr\": 0.027368078243971642,\n \"acc_norm\": 0.6334405144694534,\n\
\ \"acc_norm_stderr\": 0.027368078243971642\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6080246913580247,\n \"acc_stderr\": 0.027163686038271146,\n\
\ \"acc_norm\": 0.6080246913580247,\n \"acc_norm_stderr\": 0.027163686038271146\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40425531914893614,\n \"acc_stderr\": 0.02927553215970473,\n \
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.02927553215970473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.409387222946545,\n\
\ \"acc_stderr\": 0.01255878089557075,\n \"acc_norm\": 0.409387222946545,\n\
\ \"acc_norm_stderr\": 0.01255878089557075\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5367647058823529,\n \"acc_stderr\": 0.03029061918048569,\n\
\ \"acc_norm\": 0.5367647058823529,\n \"acc_norm_stderr\": 0.03029061918048569\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5604575163398693,\n \"acc_stderr\": 0.020079420408087925,\n \
\ \"acc_norm\": 0.5604575163398693,\n \"acc_norm_stderr\": 0.020079420408087925\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.04653429807913507,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.04653429807913507\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6408163265306123,\n \"acc_stderr\": 0.030713560455108493,\n\
\ \"acc_norm\": 0.6408163265306123,\n \"acc_norm_stderr\": 0.030713560455108493\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7213930348258707,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.7213930348258707,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826369,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826369\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35862913096695226,\n\
\ \"mc1_stderr\": 0.016789289499502022,\n \"mc2\": 0.5121990488422448,\n\
\ \"mc2_stderr\": 0.015675496047240438\n }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Stheno-Inverted-1.2-L2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|arc:challenge|25_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hellaswag|10_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T08-39-09.830541.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T08-39-09.830541.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T08-39-09.830541.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T08-39-09.830541.parquet'
- config_name: results
data_files:
- split: 2023_09_12T08_39_09.830541
path:
- results_2023-09-12T08-39-09.830541.parquet
- split: latest
path:
- results_2023-09-12T08-39-09.830541.parquet
---
# Dataset Card for Evaluation run of Sao10K/Stheno-Inverted-1.2-L2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/Stheno-Inverted-1.2-L2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sao10K/Stheno-Inverted-1.2-L2-13B](https://huggingface.co/Sao10K/Stheno-Inverted-1.2-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Stheno-Inverted-1.2-L2-13B",
"harness_truthfulqa_mc_0",
split="train")
```
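The same pattern works for any of the per-task configurations listed in the metadata above; as a minimal sketch (the anatomy config is just an illustrative pick, and the variable name is arbitrary), this loads one five-shot MMLU subtask at its latest timestamp:
```python
from datasets import load_dataset

# Each per-task config exposes one split per run timestamp plus a "latest" alias,
# so requesting "latest" always returns the most recent evaluation details.
anatomy_details = load_dataset("open-llm-leaderboard/details_Sao10K__Stheno-Inverted-1.2-L2-13B",
	"harness_hendrycksTest_anatomy_5",
	split="latest")
```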
## Latest results
These are the [latest results from run 2023-09-12T08:39:09.830541](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-Inverted-1.2-L2-13B/blob/main/results_2023-09-12T08-39-09.830541.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5593383883718774,
"acc_stderr": 0.03445294831748893,
"acc_norm": 0.5629656079168563,
"acc_norm_stderr": 0.03443327926234939,
"mc1": 0.35862913096695226,
"mc1_stderr": 0.016789289499502022,
"mc2": 0.5121990488422448,
"mc2_stderr": 0.015675496047240438
},
"harness|arc:challenge|25": {
"acc": 0.568259385665529,
"acc_stderr": 0.014474591427196202,
"acc_norm": 0.5938566552901023,
"acc_norm_stderr": 0.014351656690097862
},
"harness|hellaswag|10": {
"acc": 0.6417048396733719,
"acc_stderr": 0.00478519504988916,
"acc_norm": 0.8301135232025493,
"acc_norm_stderr": 0.003747655533754522
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.04040311062490436,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.04040311062490436
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6264150943396226,
"acc_stderr": 0.029773082713319875,
"acc_norm": 0.6264150943396226,
"acc_norm_stderr": 0.029773082713319875
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4913294797687861,
"acc_stderr": 0.03811890988940412,
"acc_norm": 0.4913294797687861,
"acc_norm_stderr": 0.03811890988940412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.023973861998992072,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.023973861998992072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6580645161290323,
"acc_stderr": 0.02698528957655275,
"acc_norm": 0.6580645161290323,
"acc_norm_stderr": 0.02698528957655275
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162934,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162934
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.03663974994391245,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.03663974994391245
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8134715025906736,
"acc_stderr": 0.028112091210117474,
"acc_norm": 0.8134715025906736,
"acc_norm_stderr": 0.028112091210117474
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5230769230769231,
"acc_stderr": 0.025323990861736232,
"acc_norm": 0.5230769230769231,
"acc_norm_stderr": 0.025323990861736232
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815635,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815635
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.542016806722689,
"acc_stderr": 0.032363611119519416,
"acc_norm": 0.542016806722689,
"acc_norm_stderr": 0.032363611119519416
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7394495412844037,
"acc_stderr": 0.01881918203485007,
"acc_norm": 0.7394495412844037,
"acc_norm_stderr": 0.01881918203485007
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.36574074074074076,
"acc_stderr": 0.03284738857647207,
"acc_norm": 0.36574074074074076,
"acc_norm_stderr": 0.03284738857647207
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460295,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460295
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.03642914578292406,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.03642914578292406
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280042,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280042
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7509578544061303,
"acc_stderr": 0.015464676163395951,
"acc_norm": 0.7509578544061303,
"acc_norm_stderr": 0.015464676163395951
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.02622615860512466,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.02622615860512466
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40670391061452515,
"acc_stderr": 0.01642881191589887,
"acc_norm": 0.40670391061452515,
"acc_norm_stderr": 0.01642881191589887
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.027870745278290275,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.027870745278290275
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6334405144694534,
"acc_stderr": 0.027368078243971642,
"acc_norm": 0.6334405144694534,
"acc_norm_stderr": 0.027368078243971642
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6080246913580247,
"acc_stderr": 0.027163686038271146,
"acc_norm": 0.6080246913580247,
"acc_norm_stderr": 0.027163686038271146
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.02927553215970473,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.02927553215970473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.409387222946545,
"acc_stderr": 0.01255878089557075,
"acc_norm": 0.409387222946545,
"acc_norm_stderr": 0.01255878089557075
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5367647058823529,
"acc_stderr": 0.03029061918048569,
"acc_norm": 0.5367647058823529,
"acc_norm_stderr": 0.03029061918048569
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5604575163398693,
"acc_stderr": 0.020079420408087925,
"acc_norm": 0.5604575163398693,
"acc_norm_stderr": 0.020079420408087925
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.04653429807913507,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.04653429807913507
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6408163265306123,
"acc_stderr": 0.030713560455108493,
"acc_norm": 0.6408163265306123,
"acc_norm_stderr": 0.030713560455108493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7213930348258707,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.7213930348258707,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826369,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826369
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35862913096695226,
"mc1_stderr": 0.016789289499502022,
"mc2": 0.5121990488422448,
"mc2_stderr": 0.015675496047240438
}
}
```
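These aggregated numbers are also stored in the `results` configuration described in the metadata above, so they can be loaded directly rather than copied from this card; a minimal sketch (variable name illustrative):
```python
from datasets import load_dataset

# The "results" config holds the aggregated metrics shown above;
# its "latest" split resolves to the most recent results_*.parquet file.
results = load_dataset("open-llm-leaderboard/details_Sao10K__Stheno-Inverted-1.2-L2-13B",
	"results",
	split="latest")
```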
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
DreamscapeAI/checkpoints | 2023-09-12T09:04:37.000Z | [
"region:us"
] | DreamscapeAI | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098 | 2023-09-12T08:49:07.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of KnutJaegersberg/RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KnutJaegersberg/RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098](https://huggingface.co/KnutJaegersberg/RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T08:47:54.050773](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098/blob/main/results_2023-09-12T08-47-54.050773.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24586614030004614,\n\
\ \"acc_stderr\": 0.0312058088398341,\n \"acc_norm\": 0.2474604881761793,\n\
\ \"acc_norm_stderr\": 0.031216583955736288,\n \"mc1\": 0.20685434516523868,\n\
\ \"mc1_stderr\": 0.01417959149672834,\n \"mc2\": 0.375692045030614,\n\
\ \"mc2_stderr\": 0.014268705914029215\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.23378839590443687,\n \"acc_stderr\": 0.012368225378507156,\n\
\ \"acc_norm\": 0.26023890784982934,\n \"acc_norm_stderr\": 0.01282193022511256\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.33628759211312487,\n\
\ \"acc_stderr\": 0.0047147308653986445,\n \"acc_norm\": 0.4039036048595897,\n\
\ \"acc_norm_stderr\": 0.004896757857022547\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.31851851851851853,\n\
\ \"acc_stderr\": 0.04024778401977111,\n \"acc_norm\": 0.31851851851851853,\n\
\ \"acc_norm_stderr\": 0.04024778401977111\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123408,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123408\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.025288394502891363,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.025288394502891363\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.034765901043041336,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.034765901043041336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\"\
: 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.040925639582376556,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.040925639582376556\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2936170212765957,\n \"acc_stderr\": 0.029771642712491227,\n\
\ \"acc_norm\": 0.2936170212765957,\n \"acc_norm_stderr\": 0.029771642712491227\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n\
\ \"acc_stderr\": 0.038924311065187525,\n \"acc_norm\": 0.21929824561403508,\n\
\ \"acc_norm_stderr\": 0.038924311065187525\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.21379310344827587,\n \"acc_stderr\": 0.034165204477475494,\n\
\ \"acc_norm\": 0.21379310344827587,\n \"acc_norm_stderr\": 0.034165204477475494\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525214,\n \"\
acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525214\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604673,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604673\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25161290322580643,\n\
\ \"acc_stderr\": 0.024685979286239973,\n \"acc_norm\": 0.25161290322580643,\n\
\ \"acc_norm_stderr\": 0.024685979286239973\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.21182266009852216,\n \"acc_stderr\": 0.02874898368994108,\n\
\ \"acc_norm\": 0.21182266009852216,\n \"acc_norm_stderr\": 0.02874898368994108\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.03524390844511783,\n\
\ \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.03524390844511783\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.20202020202020202,\n \"acc_stderr\": 0.02860620428922988,\n \"\
acc_norm\": 0.20202020202020202,\n \"acc_norm_stderr\": 0.02860620428922988\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.23834196891191708,\n \"acc_stderr\": 0.030748905363909895,\n\
\ \"acc_norm\": 0.23834196891191708,\n \"acc_norm_stderr\": 0.030748905363909895\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.24871794871794872,\n \"acc_stderr\": 0.021916957709213796,\n\
\ \"acc_norm\": 0.24871794871794872,\n \"acc_norm_stderr\": 0.021916957709213796\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176892,\n\
\ \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176892\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23178807947019867,\n \"acc_stderr\": 0.03445406271987054,\n \"\
acc_norm\": 0.23178807947019867,\n \"acc_norm_stderr\": 0.03445406271987054\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.2018348623853211,\n \"acc_stderr\": 0.01720857935778757,\n \"\
acc_norm\": 0.2018348623853211,\n \"acc_norm_stderr\": 0.01720857935778757\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.26851851851851855,\n \"acc_stderr\": 0.030225226160012393,\n \"\
acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.030225226160012393\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.21568627450980393,\n \"acc_stderr\": 0.028867431449849313,\n \"\
acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.028867431449849313\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.25738396624472576,\n \"acc_stderr\": 0.028458820991460295,\n \
\ \"acc_norm\": 0.25738396624472576,\n \"acc_norm_stderr\": 0.028458820991460295\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2645739910313901,\n\
\ \"acc_stderr\": 0.02960510321703833,\n \"acc_norm\": 0.2645739910313901,\n\
\ \"acc_norm_stderr\": 0.02960510321703833\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.15267175572519084,\n \"acc_stderr\": 0.03154521672005472,\n\
\ \"acc_norm\": 0.15267175572519084,\n \"acc_norm_stderr\": 0.03154521672005472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2892561983471074,\n \"acc_stderr\": 0.04139112727635463,\n \"\
acc_norm\": 0.2892561983471074,\n \"acc_norm_stderr\": 0.04139112727635463\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.033220157957767414,\n\
\ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.033220157957767414\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340455,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340455\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.23300970873786409,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.23300970873786409,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2094017094017094,\n\
\ \"acc_stderr\": 0.026655699653922768,\n \"acc_norm\": 0.2094017094017094,\n\
\ \"acc_norm_stderr\": 0.026655699653922768\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2567049808429119,\n\
\ \"acc_stderr\": 0.015620480263064524,\n \"acc_norm\": 0.2567049808429119,\n\
\ \"acc_norm_stderr\": 0.015620480263064524\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.023445826276545546,\n\
\ \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.023445826276545546\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2558659217877095,\n\
\ \"acc_stderr\": 0.014593620923210742,\n \"acc_norm\": 0.2558659217877095,\n\
\ \"acc_norm_stderr\": 0.014593620923210742\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023805186524888156,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023805186524888156\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2315112540192926,\n\
\ \"acc_stderr\": 0.023956532766639133,\n \"acc_norm\": 0.2315112540192926,\n\
\ \"acc_norm_stderr\": 0.023956532766639133\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.02456922360046085,\n\
\ \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.02456922360046085\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25177304964539005,\n \"acc_stderr\": 0.0258921511567094,\n \
\ \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.0258921511567094\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24185136897001303,\n\
\ \"acc_stderr\": 0.010936550813827063,\n \"acc_norm\": 0.24185136897001303,\n\
\ \"acc_norm_stderr\": 0.010936550813827063\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.29044117647058826,\n \"acc_stderr\": 0.02757646862274052,\n\
\ \"acc_norm\": 0.29044117647058826,\n \"acc_norm_stderr\": 0.02757646862274052\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25163398692810457,\n \"acc_stderr\": 0.017555818091322263,\n \
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.017555818091322263\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.32727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252089,\n \"acc_norm\": 0.32727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252089\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2979591836734694,\n \"acc_stderr\": 0.02927956741106567,\n\
\ \"acc_norm\": 0.2979591836734694,\n \"acc_norm_stderr\": 0.02927956741106567\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25903614457831325,\n\
\ \"acc_stderr\": 0.034106466140718564,\n \"acc_norm\": 0.25903614457831325,\n\
\ \"acc_norm_stderr\": 0.034106466140718564\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03218093795602357,\n\
\ \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03218093795602357\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.20685434516523868,\n\
\ \"mc1_stderr\": 0.01417959149672834,\n \"mc2\": 0.375692045030614,\n\
\ \"mc2_stderr\": 0.014268705914029215\n }\n}\n```"
repo_url: https://huggingface.co/KnutJaegersberg/RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|arc:challenge|25_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hellaswag|10_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T08-47-54.050773.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T08-47-54.050773.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T08-47-54.050773.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T08-47-54.050773.parquet'
- config_name: results
data_files:
- split: 2023_09_12T08_47_54.050773
path:
- results_2023-09-12T08-47-54.050773.parquet
- split: latest
path:
- results_2023-09-12T08-47-54.050773.parquet
---
# Dataset Card for Evaluation run of KnutJaegersberg/RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KnutJaegersberg/RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KnutJaegersberg/RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098](https://huggingface.co/KnutJaegersberg/RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098",
"harness_truthfulqa_mc_0",
split="train")
```
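The same pattern works for the aggregated "results" configuration mentioned above. A minimal sketch, using the config and split names declared in this card's YAML:
```python
from datasets import load_dataset

# "results" and "latest" are the config and split names declared above;
# the row layout of the returned table is whatever the run produced.
results = load_dataset(
    "open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098",
    "results",
    split="latest",
)
print(results[0])
```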
## Latest results
These are the [latest results from run 2023-09-12T08:47:54.050773](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098/blob/main/results_2023-09-12T08-47-54.050773.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24586614030004614,
"acc_stderr": 0.0312058088398341,
"acc_norm": 0.2474604881761793,
"acc_norm_stderr": 0.031216583955736288,
"mc1": 0.20685434516523868,
"mc1_stderr": 0.01417959149672834,
"mc2": 0.375692045030614,
"mc2_stderr": 0.014268705914029215
},
"harness|arc:challenge|25": {
"acc": 0.23378839590443687,
"acc_stderr": 0.012368225378507156,
"acc_norm": 0.26023890784982934,
"acc_norm_stderr": 0.01282193022511256
},
"harness|hellaswag|10": {
"acc": 0.33628759211312487,
"acc_stderr": 0.0047147308653986445,
"acc_norm": 0.4039036048595897,
"acc_norm_stderr": 0.004896757857022547
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.04024778401977111,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.04024778401977111
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123408,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123408
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.025288394502891363,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.025288394502891363
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.034765901043041336,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.034765901043041336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.040925639582376556,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.040925639582376556
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2936170212765957,
"acc_stderr": 0.029771642712491227,
"acc_norm": 0.2936170212765957,
"acc_norm_stderr": 0.029771642712491227
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.038924311065187525,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.038924311065187525
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.21379310344827587,
"acc_stderr": 0.034165204477475494,
"acc_norm": 0.21379310344827587,
"acc_norm_stderr": 0.034165204477475494
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525214,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525214
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604673,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604673
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25161290322580643,
"acc_stderr": 0.024685979286239973,
"acc_norm": 0.25161290322580643,
"acc_norm_stderr": 0.024685979286239973
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21182266009852216,
"acc_stderr": 0.02874898368994108,
"acc_norm": 0.21182266009852216,
"acc_norm_stderr": 0.02874898368994108
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.28484848484848485,
"acc_stderr": 0.03524390844511783,
"acc_norm": 0.28484848484848485,
"acc_norm_stderr": 0.03524390844511783
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20202020202020202,
"acc_stderr": 0.02860620428922988,
"acc_norm": 0.20202020202020202,
"acc_norm_stderr": 0.02860620428922988
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23834196891191708,
"acc_stderr": 0.030748905363909895,
"acc_norm": 0.23834196891191708,
"acc_norm_stderr": 0.030748905363909895
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24871794871794872,
"acc_stderr": 0.021916957709213796,
"acc_norm": 0.24871794871794872,
"acc_norm_stderr": 0.021916957709213796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895991,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895991
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23178807947019867,
"acc_stderr": 0.03445406271987054,
"acc_norm": 0.23178807947019867,
"acc_norm_stderr": 0.03445406271987054
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.2018348623853211,
"acc_stderr": 0.01720857935778757,
"acc_norm": 0.2018348623853211,
"acc_norm_stderr": 0.01720857935778757
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.030225226160012393,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.030225226160012393
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25738396624472576,
"acc_stderr": 0.028458820991460295,
"acc_norm": 0.25738396624472576,
"acc_norm_stderr": 0.028458820991460295
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2645739910313901,
"acc_stderr": 0.02960510321703833,
"acc_norm": 0.2645739910313901,
"acc_norm_stderr": 0.02960510321703833
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.15267175572519084,
"acc_stderr": 0.03154521672005472,
"acc_norm": 0.15267175572519084,
"acc_norm_stderr": 0.03154521672005472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2892561983471074,
"acc_stderr": 0.04139112727635463,
"acc_norm": 0.2892561983471074,
"acc_norm_stderr": 0.04139112727635463
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.033220157957767414,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.033220157957767414
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340455,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340455
},
"harness|hendrycksTest-management|5": {
"acc": 0.23300970873786409,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.23300970873786409,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2094017094017094,
"acc_stderr": 0.026655699653922768,
"acc_norm": 0.2094017094017094,
"acc_norm_stderr": 0.026655699653922768
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2567049808429119,
"acc_stderr": 0.015620480263064524,
"acc_norm": 0.2567049808429119,
"acc_norm_stderr": 0.015620480263064524
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.023445826276545546,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.023445826276545546
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2558659217877095,
"acc_stderr": 0.014593620923210742,
"acc_norm": 0.2558659217877095,
"acc_norm_stderr": 0.014593620923210742
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.023805186524888156,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.023805186524888156
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2315112540192926,
"acc_stderr": 0.023956532766639133,
"acc_norm": 0.2315112540192926,
"acc_norm_stderr": 0.023956532766639133
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.02456922360046085,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.02456922360046085
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.0258921511567094,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.0258921511567094
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24185136897001303,
"acc_stderr": 0.010936550813827063,
"acc_norm": 0.24185136897001303,
"acc_norm_stderr": 0.010936550813827063
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.29044117647058826,
"acc_stderr": 0.02757646862274052,
"acc_norm": 0.29044117647058826,
"acc_norm_stderr": 0.02757646862274052
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.017555818091322263,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.017555818091322263
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.32727272727272727,
"acc_stderr": 0.04494290866252089,
"acc_norm": 0.32727272727272727,
"acc_norm_stderr": 0.04494290866252089
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2979591836734694,
"acc_stderr": 0.02927956741106567,
"acc_norm": 0.2979591836734694,
"acc_norm_stderr": 0.02927956741106567
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25903614457831325,
"acc_stderr": 0.034106466140718564,
"acc_norm": 0.25903614457831325,
"acc_norm_stderr": 0.034106466140718564
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03218093795602357,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03218093795602357
},
"harness|truthfulqa:mc|0": {
"mc1": 0.20685434516523868,
"mc1_stderr": 0.01417959149672834,
"mc2": 0.375692045030614,
"mc2_stderr": 0.014268705914029215
}
}
```
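If you prefer the raw JSON over the `datasets` API, a minimal sketch using `huggingface_hub` follows; the exact key layout inside the file is an assumption here (the block shown above may be nested under a top-level "results" field):
```python
import json
from huggingface_hub import hf_hub_download

# Fetch the raw results file linked above from this dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098",
    filename="results_2023-09-12T08-47-54.050773.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)
scores = data.get("results", data)  # tolerate either nesting
print(scores["all"]["acc"])         # average accuracy across tasks
```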
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
hanho/test | 2023-09-14T04:47:10.000Z | [
"license:openrail",
"doi:10.57967/hf/1110",
"region:us"
] | hanho | null | null | null | 0 | 0 | ---
license: openrail
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: pokemon
dtype: string
- name: type
dtype: string
splits:
- name: train
num_bytes: 43
num_examples: 2
download_size: 0
dataset_size: 43
---
|
SebastianBodza/TextbooksAreAllYouNeed | 2023-09-13T12:43:12.000Z | [
"region:us"
] | SebastianBodza | null | null | null | 0 | 0 | Creating high-quality synthetic datasets (a generation sketch follows the lists below):
- [x] Python Textbook with hands-on experience and Code-Exercises -> 42,491 words, 285,786 characters
- [x] Test-Driven Development with Python -> 66,126 words, 478,070 characters
- [x] Torch in Python Textbook with hands-on experience and Code-Exercises -> 60,149 words, 473,343 characters
Todo:
- [ ] [*programming language*] Textbook with hands-on experience and Code-Exercises
- [ ] Test-driven development with [*programming language*] hands-on experience and Code-Exercises
- [ ] [*special lib*] with [*programming language*] Textbook with hands-on experience and Code-Exercises
Ideas:
- programming languages: JavaScript, Java, C, C++, C#, Go, (HTML/CSS), SQL, TypeScript, (Bash/Shell), PHP, Rust, Kotlin, Ruby
- special libs:
  - HTML/CSS: Bootstrap, Tailwind
  - Python: Torch, TensorFlow, MLflow, FastAPI, Flask
  - JavaScript/TypeScript: Angular, React
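As a rough illustration of how one of the todo templates above could be filled in, here is a minimal sketch assuming an OpenAI-compatible client; the model name and prompt wording are hypothetical, not the exact setup behind this dataset:
```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def generate_textbook_section(language: str, topic: str) -> str:
    """Fill the '[programming language] Textbook' template for one topic."""
    prompt = (
        f"Write a section of a {language} textbook on {topic}. "
        "Include hands-on explanations and code exercises with solutions."
    )
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # hypothetical model choice
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

print(generate_textbook_section("Python", "test-driven development")[:500])
```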
|
rikdas/madras_dataset | 2023-09-12T08:53:21.000Z | [
"region:us"
] | rikdas | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 22751754.0
num_examples: 10
download_size: 22753302
dataset_size: 22751754.0
---
# Dataset Card for "madras_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
salmonrk/Video | 2023-09-12T09:10:50.000Z | [
"region:us"
] | salmonrk | null | null | null | 0 | 0 | Entry not found |
ahmet1338/turkishReviews-ds-mini | 2023-10-02T19:23:56.000Z | [
"language:tr",
"region:us"
] | ahmet1338 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: review
dtype: string
- name: review_length
dtype: int64
splits:
- name: train
num_bytes: 1252876.2642514652
num_examples: 3378
- name: validation
num_bytes: 139455.7357485349
num_examples: 376
download_size: 896649
dataset_size: 1392332
language:
- tr
---
# Dataset Card for "turkishReviews-ds-mini"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hesha/zste | 2023-09-17T19:21:44.000Z | [
"region:us"
] | hesha | null | null | null | 0 | 0 | Entry not found |
MikeXydas/qr2t_benchmark | 2023-09-12T09:48:03.000Z | [
"license:mit",
"region:us"
] | MikeXydas | null | null | null | 0 | 0 | ---
license: mit
---
|
open-llm-leaderboard/details_Sao10K__Medusa-1.1-L2-7B | 2023-09-12T09:53:38.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Sao10K/Medusa-1.1-L2-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Medusa-1.1-L2-7B](https://huggingface.co/Sao10K/Medusa-1.1-L2-7B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Medusa-1.1-L2-7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T09:52:20.607338](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Medusa-1.1-L2-7B/blob/main/results_2023-09-12T09-52-20.607338.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5171140921930181,\n\
\ \"acc_stderr\": 0.034904426589943305,\n \"acc_norm\": 0.5209789899515811,\n\
\ \"acc_norm_stderr\": 0.034889078020391644,\n \"mc1\": 0.3353733170134639,\n\
\ \"mc1_stderr\": 0.01652753403966899,\n \"mc2\": 0.47700972785978124,\n\
\ \"mc2_stderr\": 0.015507351809176053\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5290102389078498,\n \"acc_stderr\": 0.01458677635529432,\n\
\ \"acc_norm\": 0.5648464163822525,\n \"acc_norm_stderr\": 0.014487986197186045\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5935072694682334,\n\
\ \"acc_stderr\": 0.004901747426331734,\n \"acc_norm\": 0.785700059749054,\n\
\ \"acc_norm_stderr\": 0.0040949719808920804\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.45394736842105265,\n \"acc_stderr\": 0.04051646342874143,\n\
\ \"acc_norm\": 0.45394736842105265,\n \"acc_norm_stderr\": 0.04051646342874143\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5773584905660377,\n \"acc_stderr\": 0.03040233144576954,\n\
\ \"acc_norm\": 0.5773584905660377,\n \"acc_norm_stderr\": 0.03040233144576954\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.041553199555931467,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.041553199555931467\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4508670520231214,\n\
\ \"acc_stderr\": 0.0379401267469703,\n \"acc_norm\": 0.4508670520231214,\n\
\ \"acc_norm_stderr\": 0.0379401267469703\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n\
\ \"acc_stderr\": 0.044895393502707,\n \"acc_norm\": 0.3508771929824561,\n\
\ \"acc_norm_stderr\": 0.044895393502707\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.024229965298425082,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.024229965298425082\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04006168083848878,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04006168083848878\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6193548387096774,\n\
\ \"acc_stderr\": 0.027621717832907025,\n \"acc_norm\": 0.6193548387096774,\n\
\ \"acc_norm_stderr\": 0.027621717832907025\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3891625615763547,\n \"acc_stderr\": 0.034304624161038716,\n\
\ \"acc_norm\": 0.3891625615763547,\n \"acc_norm_stderr\": 0.034304624161038716\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6717171717171717,\n \"acc_stderr\": 0.03345678422756775,\n \"\
acc_norm\": 0.6717171717171717,\n \"acc_norm_stderr\": 0.03345678422756775\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7305699481865285,\n \"acc_stderr\": 0.03201867122877794,\n\
\ \"acc_norm\": 0.7305699481865285,\n \"acc_norm_stderr\": 0.03201867122877794\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.48717948717948717,\n \"acc_stderr\": 0.025342671293807257,\n\
\ \"acc_norm\": 0.48717948717948717,\n \"acc_norm_stderr\": 0.025342671293807257\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23333333333333334,\n \"acc_stderr\": 0.02578787422095932,\n \
\ \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.02578787422095932\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5126050420168067,\n \"acc_stderr\": 0.03246816765752174,\n \
\ \"acc_norm\": 0.5126050420168067,\n \"acc_norm_stderr\": 0.03246816765752174\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6972477064220184,\n\
\ \"acc_stderr\": 0.019698711434756343,\n \"acc_norm\": 0.6972477064220184,\n\
\ \"acc_norm_stderr\": 0.019698711434756343\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n\
\ \"acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7254901960784313,\n \"acc_stderr\": 0.031321798030832904,\n \"\
acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.031321798030832904\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7172995780590717,\n \"acc_stderr\": 0.029312814153955924,\n \
\ \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.029312814153955924\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5919282511210763,\n\
\ \"acc_stderr\": 0.03298574607842821,\n \"acc_norm\": 0.5919282511210763,\n\
\ \"acc_norm_stderr\": 0.03298574607842821\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.04338920305792401,\n\
\ \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.04338920305792401\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.04345724570292535,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.04345724570292535\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.0471282125742677,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.0471282125742677\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6012269938650306,\n \"acc_stderr\": 0.03847021420456024,\n\
\ \"acc_norm\": 0.6012269938650306,\n \"acc_norm_stderr\": 0.03847021420456024\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.046897659372781335,\n\
\ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.046897659372781335\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7735042735042735,\n\
\ \"acc_stderr\": 0.027421007295392912,\n \"acc_norm\": 0.7735042735042735,\n\
\ \"acc_norm_stderr\": 0.027421007295392912\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7100893997445722,\n\
\ \"acc_stderr\": 0.016225017944770978,\n \"acc_norm\": 0.7100893997445722,\n\
\ \"acc_norm_stderr\": 0.016225017944770978\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5838150289017341,\n \"acc_stderr\": 0.026538189104705477,\n\
\ \"acc_norm\": 0.5838150289017341,\n \"acc_norm_stderr\": 0.026538189104705477\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n\
\ \"acc_stderr\": 0.014465893829859923,\n \"acc_norm\": 0.24916201117318434,\n\
\ \"acc_norm_stderr\": 0.014465893829859923\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.028452639985088006,\n\
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.028452639985088006\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6012861736334405,\n\
\ \"acc_stderr\": 0.0278093225857745,\n \"acc_norm\": 0.6012861736334405,\n\
\ \"acc_norm_stderr\": 0.0278093225857745\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5864197530864198,\n \"acc_stderr\": 0.027402042040269962,\n\
\ \"acc_norm\": 0.5864197530864198,\n \"acc_norm_stderr\": 0.027402042040269962\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.375886524822695,\n \"acc_stderr\": 0.028893955412115882,\n \
\ \"acc_norm\": 0.375886524822695,\n \"acc_norm_stderr\": 0.028893955412115882\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3956975228161669,\n\
\ \"acc_stderr\": 0.01248929073544901,\n \"acc_norm\": 0.3956975228161669,\n\
\ \"acc_norm_stderr\": 0.01248929073544901\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47794117647058826,\n \"acc_stderr\": 0.030343264224213535,\n\
\ \"acc_norm\": 0.47794117647058826,\n \"acc_norm_stderr\": 0.030343264224213535\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.511437908496732,\n \"acc_stderr\": 0.020222541515610863,\n \
\ \"acc_norm\": 0.511437908496732,\n \"acc_norm_stderr\": 0.020222541515610863\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731571,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731571\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.03168091161233882,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.03168091161233882\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n\
\ \"acc_stderr\": 0.03235743789355042,\n \"acc_norm\": 0.7014925373134329,\n\
\ \"acc_norm_stderr\": 0.03235743789355042\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3353733170134639,\n\
\ \"mc1_stderr\": 0.01652753403966899,\n \"mc2\": 0.47700972785978124,\n\
\ \"mc2_stderr\": 0.015507351809176053\n }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Medusa-1.1-L2-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|arc:challenge|25_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hellaswag|10_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T09-52-20.607338.parquet'
- config_name: results
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- results_2023-09-12T09-52-20.607338.parquet
- split: latest
path:
- results_2023-09-12T09-52-20.607338.parquet
---
# Dataset Card for Evaluation run of Sao10K/Medusa-1.1-L2-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/Medusa-1.1-L2-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sao10K/Medusa-1.1-L2-7B](https://huggingface.co/Sao10K/Medusa-1.1-L2-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Medusa-1.1-L2-7B",
"harness_truthfulqa_mc_0",
split="train")
```
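The same call pattern also works for the aggregated scores: the `results` config defined in the YAML header above exposes a `latest` split that resolves to the most recent run. A minimal sketch, assuming the `datasets` library is installed:
```python
from datasets import load_dataset

# Aggregated metrics ("results" config); the "latest" split
# always points at the newest results file for this model.
results = load_dataset("open-llm-leaderboard/details_Sao10K__Medusa-1.1-L2-7B",
                       "results",
                       split="latest")
```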
## Latest results
These are the [latest results from run 2023-09-12T09:52:20.607338](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Medusa-1.1-L2-7B/blob/main/results_2023-09-12T09-52-20.607338.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5171140921930181,
"acc_stderr": 0.034904426589943305,
"acc_norm": 0.5209789899515811,
"acc_norm_stderr": 0.034889078020391644,
"mc1": 0.3353733170134639,
"mc1_stderr": 0.01652753403966899,
"mc2": 0.47700972785978124,
"mc2_stderr": 0.015507351809176053
},
"harness|arc:challenge|25": {
"acc": 0.5290102389078498,
"acc_stderr": 0.01458677635529432,
"acc_norm": 0.5648464163822525,
"acc_norm_stderr": 0.014487986197186045
},
"harness|hellaswag|10": {
"acc": 0.5935072694682334,
"acc_stderr": 0.004901747426331734,
"acc_norm": 0.785700059749054,
"acc_norm_stderr": 0.0040949719808920804
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.45394736842105265,
"acc_stderr": 0.04051646342874143,
"acc_norm": 0.45394736842105265,
"acc_norm_stderr": 0.04051646342874143
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5773584905660377,
"acc_stderr": 0.03040233144576954,
"acc_norm": 0.5773584905660377,
"acc_norm_stderr": 0.03040233144576954
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.041553199555931467,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.041553199555931467
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4508670520231214,
"acc_stderr": 0.0379401267469703,
"acc_norm": 0.4508670520231214,
"acc_norm_stderr": 0.0379401267469703
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502707,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.024229965298425082,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.024229965298425082
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04006168083848878,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04006168083848878
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6193548387096774,
"acc_stderr": 0.027621717832907025,
"acc_norm": 0.6193548387096774,
"acc_norm_stderr": 0.027621717832907025
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3891625615763547,
"acc_stderr": 0.034304624161038716,
"acc_norm": 0.3891625615763547,
"acc_norm_stderr": 0.034304624161038716
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6717171717171717,
"acc_stderr": 0.03345678422756775,
"acc_norm": 0.6717171717171717,
"acc_norm_stderr": 0.03345678422756775
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7305699481865285,
"acc_stderr": 0.03201867122877794,
"acc_norm": 0.7305699481865285,
"acc_norm_stderr": 0.03201867122877794
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.48717948717948717,
"acc_stderr": 0.025342671293807257,
"acc_norm": 0.48717948717948717,
"acc_norm_stderr": 0.025342671293807257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.02578787422095932,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.02578787422095932
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5126050420168067,
"acc_stderr": 0.03246816765752174,
"acc_norm": 0.5126050420168067,
"acc_norm_stderr": 0.03246816765752174
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6972477064220184,
"acc_stderr": 0.019698711434756343,
"acc_norm": 0.6972477064220184,
"acc_norm_stderr": 0.019698711434756343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.031321798030832904,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.031321798030832904
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.029312814153955924,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.029312814153955924
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5919282511210763,
"acc_stderr": 0.03298574607842821,
"acc_norm": 0.5919282511210763,
"acc_norm_stderr": 0.03298574607842821
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5725190839694656,
"acc_stderr": 0.04338920305792401,
"acc_norm": 0.5725190839694656,
"acc_norm_stderr": 0.04338920305792401
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.04345724570292535,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.04345724570292535
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.0471282125742677,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.0471282125742677
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6012269938650306,
"acc_stderr": 0.03847021420456024,
"acc_norm": 0.6012269938650306,
"acc_norm_stderr": 0.03847021420456024
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.046897659372781335,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.046897659372781335
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7735042735042735,
"acc_stderr": 0.027421007295392912,
"acc_norm": 0.7735042735042735,
"acc_norm_stderr": 0.027421007295392912
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7100893997445722,
"acc_stderr": 0.016225017944770978,
"acc_norm": 0.7100893997445722,
"acc_norm_stderr": 0.016225017944770978
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.026538189104705477,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.026538189104705477
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.014465893829859923,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.014465893829859923
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.028452639985088006,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.028452639985088006
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6012861736334405,
"acc_stderr": 0.0278093225857745,
"acc_norm": 0.6012861736334405,
"acc_norm_stderr": 0.0278093225857745
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5864197530864198,
"acc_stderr": 0.027402042040269962,
"acc_norm": 0.5864197530864198,
"acc_norm_stderr": 0.027402042040269962
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.375886524822695,
"acc_stderr": 0.028893955412115882,
"acc_norm": 0.375886524822695,
"acc_norm_stderr": 0.028893955412115882
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3956975228161669,
"acc_stderr": 0.01248929073544901,
"acc_norm": 0.3956975228161669,
"acc_norm_stderr": 0.01248929073544901
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47794117647058826,
"acc_stderr": 0.030343264224213535,
"acc_norm": 0.47794117647058826,
"acc_norm_stderr": 0.030343264224213535
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.511437908496732,
"acc_stderr": 0.020222541515610863,
"acc_norm": 0.511437908496732,
"acc_norm_stderr": 0.020222541515610863
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731571,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731571
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.03168091161233882,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.03168091161233882
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.03235743789355042,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.03235743789355042
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3353733170134639,
"mc1_stderr": 0.01652753403966899,
"mc2": 0.47700972785978124,
"mc2_stderr": 0.015507351809176053
}
}
```
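Each per-task entry above maps to a standalone config (see the `configs` list in the YAML header), so the per-sample details of a single task can be loaded on their own. A short sketch using one of the config names defined above:
```python
from datasets import load_dataset

# Per-sample details for one MMLU subtask, latest run only
world_religions = load_dataset(
    "open-llm-leaderboard/details_Sao10K__Medusa-1.1-L2-7B",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
```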
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
gaotong/test2 | 2023-09-12T09:57:11.000Z | [
"region:us"
] | gaotong | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v19_R8-7B | 2023-09-12T09:59:57.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B](https://huggingface.co/PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v19_R8-7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T09:58:38.972064](https://huggingface.co/datasets/open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v19_R8-7B/blob/main/results_2023-09-12T09-58-38.972064.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4674367304116677,\n\
\ \"acc_stderr\": 0.035284344124032196,\n \"acc_norm\": 0.4714260290393888,\n\
\ \"acc_norm_stderr\": 0.03526985338617593,\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237014,\n \"mc2\": 0.3961362396399567,\n\
\ \"mc2_stderr\": 0.013785031017759436\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4948805460750853,\n \"acc_stderr\": 0.014610624890309157,\n\
\ \"acc_norm\": 0.5332764505119454,\n \"acc_norm_stderr\": 0.014578995859605802\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5902210714997013,\n\
\ \"acc_stderr\": 0.004907877144720015,\n \"acc_norm\": 0.7871937860983867,\n\
\ \"acc_norm_stderr\": 0.004084552641903664\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.40789473684210525,\n \"acc_stderr\": 0.03999309712777471,\n\
\ \"acc_norm\": 0.40789473684210525,\n \"acc_norm_stderr\": 0.03999309712777471\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4377358490566038,\n \"acc_stderr\": 0.030533338430467516,\n\
\ \"acc_norm\": 0.4377358490566038,\n \"acc_norm_stderr\": 0.030533338430467516\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4583333333333333,\n\
\ \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.4583333333333333,\n\
\ \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n\
\ \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.43352601156069365,\n\
\ \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179964,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179964\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224469,\n\
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224469\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.04122737111370331,\n\
\ \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.04122737111370331\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633363,\n \"\
acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633363\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.04190596438871136,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.04190596438871136\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5064516129032258,\n\
\ \"acc_stderr\": 0.02844163823354051,\n \"acc_norm\": 0.5064516129032258,\n\
\ \"acc_norm_stderr\": 0.02844163823354051\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.33497536945812806,\n \"acc_stderr\": 0.033208527423483104,\n\
\ \"acc_norm\": 0.33497536945812806,\n \"acc_norm_stderr\": 0.033208527423483104\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\"\
: 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.03851716319398393,\n\
\ \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.03851716319398393\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5151515151515151,\n \"acc_stderr\": 0.03560716516531061,\n \"\
acc_norm\": 0.5151515151515151,\n \"acc_norm_stderr\": 0.03560716516531061\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6580310880829016,\n \"acc_stderr\": 0.03423465100104283,\n\
\ \"acc_norm\": 0.6580310880829016,\n \"acc_norm_stderr\": 0.03423465100104283\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.44358974358974357,\n \"acc_stderr\": 0.025189149894764198,\n\
\ \"acc_norm\": 0.44358974358974357,\n \"acc_norm_stderr\": 0.025189149894764198\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.031968769891957786,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.031968769891957786\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.03734535676787198,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.03734535676787198\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6330275229357798,\n \"acc_stderr\": 0.020664675659520525,\n \"\
acc_norm\": 0.6330275229357798,\n \"acc_norm_stderr\": 0.020664675659520525\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.26851851851851855,\n \"acc_stderr\": 0.030225226160012383,\n \"\
acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.030225226160012383\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5735294117647058,\n \"acc_stderr\": 0.03471157907953427,\n \"\
acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.03471157907953427\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6244725738396625,\n \"acc_stderr\": 0.03152256243091156,\n \
\ \"acc_norm\": 0.6244725738396625,\n \"acc_norm_stderr\": 0.03152256243091156\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5605381165919282,\n\
\ \"acc_stderr\": 0.03331092511038179,\n \"acc_norm\": 0.5605381165919282,\n\
\ \"acc_norm_stderr\": 0.03331092511038179\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.04338920305792401,\n\
\ \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.04338920305792401\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068383,\n \"\
acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068383\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5214723926380368,\n \"acc_stderr\": 0.03924746876751129,\n\
\ \"acc_norm\": 0.5214723926380368,\n \"acc_norm_stderr\": 0.03924746876751129\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5631067961165048,\n \"acc_stderr\": 0.049111471073657764,\n\
\ \"acc_norm\": 0.5631067961165048,\n \"acc_norm_stderr\": 0.049111471073657764\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6923076923076923,\n\
\ \"acc_stderr\": 0.03023638994217308,\n \"acc_norm\": 0.6923076923076923,\n\
\ \"acc_norm_stderr\": 0.03023638994217308\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6462324393358876,\n\
\ \"acc_stderr\": 0.017098184708161903,\n \"acc_norm\": 0.6462324393358876,\n\
\ \"acc_norm_stderr\": 0.017098184708161903\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.026919095102908273,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.026919095102908273\n \
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n\
\ \"acc_stderr\": 0.014288343803925293,\n \"acc_norm\": 0.24022346368715083,\n\
\ \"acc_norm_stderr\": 0.014288343803925293\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.477124183006536,\n \"acc_stderr\": 0.028599936776089775,\n\
\ \"acc_norm\": 0.477124183006536,\n \"acc_norm_stderr\": 0.028599936776089775\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n\
\ \"acc_stderr\": 0.02795048149440127,\n \"acc_norm\": 0.5884244372990354,\n\
\ \"acc_norm_stderr\": 0.02795048149440127\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4845679012345679,\n \"acc_stderr\": 0.0278074900442762,\n\
\ \"acc_norm\": 0.4845679012345679,\n \"acc_norm_stderr\": 0.0278074900442762\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3829787234042553,\n \"acc_stderr\": 0.02899908090480618,\n \
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.02899908090480618\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.363754889178618,\n\
\ \"acc_stderr\": 0.012286991879902884,\n \"acc_norm\": 0.363754889178618,\n\
\ \"acc_norm_stderr\": 0.012286991879902884\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5330882352941176,\n \"acc_stderr\": 0.03030625772246832,\n\
\ \"acc_norm\": 0.5330882352941176,\n \"acc_norm_stderr\": 0.03030625772246832\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.46078431372549017,\n \"acc_stderr\": 0.020165523313907904,\n \
\ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.020165523313907904\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n\
\ \"acc_stderr\": 0.04769300568972743,\n \"acc_norm\": 0.5454545454545454,\n\
\ \"acc_norm_stderr\": 0.04769300568972743\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5061224489795918,\n \"acc_stderr\": 0.03200682020163908,\n\
\ \"acc_norm\": 0.5061224489795918,\n \"acc_norm_stderr\": 0.03200682020163908\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6318407960199005,\n\
\ \"acc_stderr\": 0.03410410565495301,\n \"acc_norm\": 0.6318407960199005,\n\
\ \"acc_norm_stderr\": 0.03410410565495301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079022,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079022\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.03508771929824563,\n\
\ \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.03508771929824563\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237014,\n \"mc2\": 0.3961362396399567,\n\
\ \"mc2_stderr\": 0.013785031017759436\n }\n}\n```"
repo_url: https://huggingface.co/PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|arc:challenge|25_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hellaswag|10_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T09-58-38.972064.parquet'
- config_name: results
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- results_2023-09-12T09-58-38.972064.parquet
- split: latest
path:
- results_2023-09-12T09-58-38.972064.parquet
---
# Dataset Card for Evaluation run of PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B](https://huggingface.co/PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v19_R8-7B",
"harness_truthfulqa_mc_0",
split="train")
```
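Each of the per-task configurations listed in this card can be loaded the same way, and the timestamped splits as well as the "latest" alias are valid `split` values. A minimal sketch, reusing one of the config names from this card:
```python
from datasets import load_dataset

# Load the most recent run of a single MMLU sub-task for this model;
# "latest" is an alias for the newest timestamped split.
data = load_dataset("open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v19_R8-7B",
	"harness_hendrycksTest_abstract_algebra_5",
	split="latest")
```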
## Latest results
These are the [latest results from run 2023-09-12T09:58:38.972064](https://huggingface.co/datasets/open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v19_R8-7B/blob/main/results_2023-09-12T09-58-38.972064.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4674367304116677,
"acc_stderr": 0.035284344124032196,
"acc_norm": 0.4714260290393888,
"acc_norm_stderr": 0.03526985338617593,
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237014,
"mc2": 0.3961362396399567,
"mc2_stderr": 0.013785031017759436
},
"harness|arc:challenge|25": {
"acc": 0.4948805460750853,
"acc_stderr": 0.014610624890309157,
"acc_norm": 0.5332764505119454,
"acc_norm_stderr": 0.014578995859605802
},
"harness|hellaswag|10": {
"acc": 0.5902210714997013,
"acc_stderr": 0.004907877144720015,
"acc_norm": 0.7871937860983867,
"acc_norm_stderr": 0.004084552641903664
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40789473684210525,
"acc_stderr": 0.03999309712777471,
"acc_norm": 0.40789473684210525,
"acc_norm_stderr": 0.03999309712777471
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4377358490566038,
"acc_stderr": 0.030533338430467516,
"acc_norm": 0.4377358490566038,
"acc_norm_stderr": 0.030533338430467516
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.03778621079092055,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.03778621079092055
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179964,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179964
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.03232146916224469,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.03232146916224469
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.04122737111370331,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.04122737111370331
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.022930973071633363,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.022930973071633363
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.04190596438871136,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.04190596438871136
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5064516129032258,
"acc_stderr": 0.02844163823354051,
"acc_norm": 0.5064516129032258,
"acc_norm_stderr": 0.02844163823354051
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33497536945812806,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.33497536945812806,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.03851716319398393,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.03851716319398393
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5151515151515151,
"acc_stderr": 0.03560716516531061,
"acc_norm": 0.5151515151515151,
"acc_norm_stderr": 0.03560716516531061
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6580310880829016,
"acc_stderr": 0.03423465100104283,
"acc_norm": 0.6580310880829016,
"acc_norm_stderr": 0.03423465100104283
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.44358974358974357,
"acc_stderr": 0.025189149894764198,
"acc_norm": 0.44358974358974357,
"acc_norm_stderr": 0.025189149894764198
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.031968769891957786,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.031968769891957786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.03734535676787198,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.03734535676787198
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6330275229357798,
"acc_stderr": 0.020664675659520525,
"acc_norm": 0.6330275229357798,
"acc_norm_stderr": 0.020664675659520525
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.030225226160012383,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.030225226160012383
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.03471157907953427,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.03471157907953427
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6244725738396625,
"acc_stderr": 0.03152256243091156,
"acc_norm": 0.6244725738396625,
"acc_norm_stderr": 0.03152256243091156
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5605381165919282,
"acc_stderr": 0.03331092511038179,
"acc_norm": 0.5605381165919282,
"acc_norm_stderr": 0.03331092511038179
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5725190839694656,
"acc_stderr": 0.04338920305792401,
"acc_norm": 0.5725190839694656,
"acc_norm_stderr": 0.04338920305792401
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.04449270350068383,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.04449270350068383
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5214723926380368,
"acc_stderr": 0.03924746876751129,
"acc_norm": 0.5214723926380368,
"acc_norm_stderr": 0.03924746876751129
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.5631067961165048,
"acc_stderr": 0.049111471073657764,
"acc_norm": 0.5631067961165048,
"acc_norm_stderr": 0.049111471073657764
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.03023638994217308,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.03023638994217308
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6462324393358876,
"acc_stderr": 0.017098184708161903,
"acc_norm": 0.6462324393358876,
"acc_norm_stderr": 0.017098184708161903
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5,
"acc_stderr": 0.026919095102908273,
"acc_norm": 0.5,
"acc_norm_stderr": 0.026919095102908273
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24022346368715083,
"acc_stderr": 0.014288343803925293,
"acc_norm": 0.24022346368715083,
"acc_norm_stderr": 0.014288343803925293
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.477124183006536,
"acc_stderr": 0.028599936776089775,
"acc_norm": 0.477124183006536,
"acc_norm_stderr": 0.028599936776089775
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5884244372990354,
"acc_stderr": 0.02795048149440127,
"acc_norm": 0.5884244372990354,
"acc_norm_stderr": 0.02795048149440127
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4845679012345679,
"acc_stderr": 0.0278074900442762,
"acc_norm": 0.4845679012345679,
"acc_norm_stderr": 0.0278074900442762
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.02899908090480618,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.02899908090480618
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.363754889178618,
"acc_stderr": 0.012286991879902884,
"acc_norm": 0.363754889178618,
"acc_norm_stderr": 0.012286991879902884
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5330882352941176,
"acc_stderr": 0.03030625772246832,
"acc_norm": 0.5330882352941176,
"acc_norm_stderr": 0.03030625772246832
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.020165523313907904,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.020165523313907904
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04769300568972743,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04769300568972743
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5061224489795918,
"acc_stderr": 0.03200682020163908,
"acc_norm": 0.5061224489795918,
"acc_norm_stderr": 0.03200682020163908
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6318407960199005,
"acc_stderr": 0.03410410565495301,
"acc_norm": 0.6318407960199005,
"acc_norm_stderr": 0.03410410565495301
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079022,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079022
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237014,
"mc2": 0.3961362396399567,
"mc2_stderr": 0.013785031017759436
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
PremierACVKetoGummies/PremierACVKetoGummi | 2023-09-12T09:59:23.000Z | [
"license:afl-3.0",
"region:us"
] | PremierACVKetoGummies | null | null | null | 0 | 0 | ---
license: afl-3.0
---
|
CyberHarem/koshimizu_sachiko_idolmastercinderellagirls | 2023-09-17T17:33:45.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of koshimizu_sachiko (THE iDOLM@STER: Cinderella Girls)
This is the dataset of koshimizu_sachiko (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 485 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 485 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 485 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 485 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
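For instance, one of the packaged archives from the table above can be fetched programmatically (a minimal sketch using `hf_hub_download` from `huggingface_hub`; the filename comes from the table, the rest is illustrative):
```python
from huggingface_hub import hf_hub_download

# Download a packaged archive from this dataset repo; the call returns
# the local path of the cached file.
zip_path = hf_hub_download(
    repo_id="CyberHarem/koshimizu_sachiko_idolmastercinderellagirls",
    filename="dataset-384x512.zip",
    repo_type="dataset",
)
print(zip_path)
```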
|
Vasterlord/Dnd | 2023-09-12T10:12:12.000Z | [
"region:us"
] | Vasterlord | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_Sao10K__Stheno-1.1-L2-13B | 2023-09-12T10:15:33.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Sao10K/Stheno-1.1-L2-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Stheno-1.1-L2-13B](https://huggingface.co/Sao10K/Stheno-1.1-L2-13B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Stheno-1.1-L2-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T10:14:13.361250](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-1.1-L2-13B/blob/main/results_2023-09-12T10-14-13.361250.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5654206314581072,\n\
\ \"acc_stderr\": 0.03444287477993556,\n \"acc_norm\": 0.5692662344232529,\n\
\ \"acc_norm_stderr\": 0.034421272679073876,\n \"mc1\": 0.3488372093023256,\n\
\ \"mc1_stderr\": 0.016684419859986897,\n \"mc2\": 0.5030405325722809,\n\
\ \"mc2_stderr\": 0.015544005374161975\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5708191126279863,\n \"acc_stderr\": 0.014464085894870653,\n\
\ \"acc_norm\": 0.6075085324232082,\n \"acc_norm_stderr\": 0.01426963463567073\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6461860187213703,\n\
\ \"acc_stderr\": 0.004771751187407021,\n \"acc_norm\": 0.836387173869747,\n\
\ \"acc_norm_stderr\": 0.0036916784957679765\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411022,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411022\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.04046336883978251,\n\
\ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.04046336883978251\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.038073017265045125,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.038073017265045125\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808777,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808777\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.032500536843658404,\n\
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.032500536843658404\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.328042328042328,\n \"acc_stderr\": 0.02418049716437691,\n \"acc_norm\"\
: 0.328042328042328,\n \"acc_norm_stderr\": 0.02418049716437691\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n\
\ \"acc_stderr\": 0.02672949906834996,\n \"acc_norm\": 0.6709677419354839,\n\
\ \"acc_norm_stderr\": 0.02672949906834996\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.03510766597959217,\n\
\ \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.03510766597959217\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.696969696969697,\n \"acc_stderr\": 0.03274287914026868,\n \"acc_norm\"\
: 0.696969696969697,\n \"acc_norm_stderr\": 0.03274287914026868\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5358974358974359,\n \"acc_stderr\": 0.025285585990017845,\n\
\ \"acc_norm\": 0.5358974358974359,\n \"acc_norm_stderr\": 0.025285585990017845\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228412,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228412\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5798319327731093,\n \"acc_stderr\": 0.03206183783236153,\n\
\ \"acc_norm\": 0.5798319327731093,\n \"acc_norm_stderr\": 0.03206183783236153\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7357798165137615,\n \"acc_stderr\": 0.018904164171510175,\n \"\
acc_norm\": 0.7357798165137615,\n \"acc_norm_stderr\": 0.018904164171510175\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.033247089118091176,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.033247089118091176\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591362,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460302,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460302\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.041391127276354626,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.041391127276354626\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6503067484662577,\n \"acc_stderr\": 0.03746668325470021,\n\
\ \"acc_norm\": 0.6503067484662577,\n \"acc_norm_stderr\": 0.03746668325470021\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326466,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326466\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8162393162393162,\n\
\ \"acc_stderr\": 0.02537213967172293,\n \"acc_norm\": 0.8162393162393162,\n\
\ \"acc_norm_stderr\": 0.02537213967172293\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7432950191570882,\n\
\ \"acc_stderr\": 0.015620480263064512,\n \"acc_norm\": 0.7432950191570882,\n\
\ \"acc_norm_stderr\": 0.015620480263064512\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.615606936416185,\n \"acc_stderr\": 0.026189666966272035,\n\
\ \"acc_norm\": 0.615606936416185,\n \"acc_norm_stderr\": 0.026189666966272035\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4659217877094972,\n\
\ \"acc_stderr\": 0.016683615837486863,\n \"acc_norm\": 0.4659217877094972,\n\
\ \"acc_norm_stderr\": 0.016683615837486863\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.02782610930728369,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.02782610930728369\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6109324758842444,\n\
\ \"acc_stderr\": 0.027690337536485372,\n \"acc_norm\": 0.6109324758842444,\n\
\ \"acc_norm_stderr\": 0.027690337536485372\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.595679012345679,\n \"acc_stderr\": 0.027306625297327684,\n\
\ \"acc_norm\": 0.595679012345679,\n \"acc_norm_stderr\": 0.027306625297327684\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4397163120567376,\n \"acc_stderr\": 0.029609912075594106,\n \
\ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.029609912075594106\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4276401564537158,\n\
\ \"acc_stderr\": 0.012635799922765844,\n \"acc_norm\": 0.4276401564537158,\n\
\ \"acc_norm_stderr\": 0.012635799922765844\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5220588235294118,\n \"acc_stderr\": 0.030343264224213514,\n\
\ \"acc_norm\": 0.5220588235294118,\n \"acc_norm_stderr\": 0.030343264224213514\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5718954248366013,\n \"acc_stderr\": 0.020017629214213094,\n \
\ \"acc_norm\": 0.5718954248366013,\n \"acc_norm_stderr\": 0.020017629214213094\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.030862144921087555,\n\
\ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.030862144921087555\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.03096590312357303,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.03096590312357303\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3488372093023256,\n\
\ \"mc1_stderr\": 0.016684419859986897,\n \"mc2\": 0.5030405325722809,\n\
\ \"mc2_stderr\": 0.015544005374161975\n }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Stheno-1.1-L2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|arc:challenge|25_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hellaswag|10_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T10-14-13.361250.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T10-14-13.361250.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T10-14-13.361250.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T10-14-13.361250.parquet'
- config_name: results
data_files:
- split: 2023_09_12T10_14_13.361250
path:
- results_2023-09-12T10-14-13.361250.parquet
- split: latest
path:
- results_2023-09-12T10-14-13.361250.parquet
---
# Dataset Card for Evaluation run of Sao10K/Stheno-1.1-L2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/Stheno-1.1-L2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sao10K/Stheno-1.1-L2-13B](https://huggingface.co/Sao10K/Stheno-1.1-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Stheno-1.1-L2-13B",
"harness_truthfulqa_mc_0",
split="train")
```
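You can also pull the per-sample details for a single eval, or pin a specific run. A minimal sketch, assuming the config and split names declared in this card's metadata (the `latest` split is an alias for the most recent timestamped run, here `2023_09_12T10_14_13.361250`):
```python
from datasets import load_dataset

# Per-sample details for one MMLU subtask; "latest" is a split alias
# defined in the configs above and resolves to the newest timestamped run.
details = load_dataset(
    "open-llm-leaderboard/details_Sao10K__Stheno-1.1-L2-13B",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
print(details[0])  # inspect one evaluated example
```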
## Latest results
These are the [latest results from run 2023-09-12T10:14:13.361250](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-1.1-L2-13B/blob/main/results_2023-09-12T10-14-13.361250.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5654206314581072,
"acc_stderr": 0.03444287477993556,
"acc_norm": 0.5692662344232529,
"acc_norm_stderr": 0.034421272679073876,
"mc1": 0.3488372093023256,
"mc1_stderr": 0.016684419859986897,
"mc2": 0.5030405325722809,
"mc2_stderr": 0.015544005374161975
},
"harness|arc:challenge|25": {
"acc": 0.5708191126279863,
"acc_stderr": 0.014464085894870653,
"acc_norm": 0.6075085324232082,
"acc_norm_stderr": 0.01426963463567073
},
"harness|hellaswag|10": {
"acc": 0.6461860187213703,
"acc_stderr": 0.004771751187407021,
"acc_norm": 0.836387173869747,
"acc_norm_stderr": 0.0036916784957679765
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.04046336883978251,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.04046336883978251
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.038073017265045125,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.038073017265045125
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808777,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.032500536843658404,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.032500536843658404
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.02418049716437691,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.02418049716437691
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6709677419354839,
"acc_stderr": 0.02672949906834996,
"acc_norm": 0.6709677419354839,
"acc_norm_stderr": 0.02672949906834996
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.03510766597959217,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.03510766597959217
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03274287914026868,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03274287914026868
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5358974358974359,
"acc_stderr": 0.025285585990017845,
"acc_norm": 0.5358974358974359,
"acc_norm_stderr": 0.025285585990017845
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228412,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228412
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5798319327731093,
"acc_stderr": 0.03206183783236153,
"acc_norm": 0.5798319327731093,
"acc_norm_stderr": 0.03206183783236153
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7357798165137615,
"acc_stderr": 0.018904164171510175,
"acc_norm": 0.7357798165137615,
"acc_norm_stderr": 0.018904164171510175
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.033247089118091176,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.033247089118091176
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460302,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460302
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.041391127276354626,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.041391127276354626
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6503067484662577,
"acc_stderr": 0.03746668325470021,
"acc_norm": 0.6503067484662577,
"acc_norm_stderr": 0.03746668325470021
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326466,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326466
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8162393162393162,
"acc_stderr": 0.02537213967172293,
"acc_norm": 0.8162393162393162,
"acc_norm_stderr": 0.02537213967172293
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7432950191570882,
"acc_stderr": 0.015620480263064512,
"acc_norm": 0.7432950191570882,
"acc_norm_stderr": 0.015620480263064512
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.615606936416185,
"acc_stderr": 0.026189666966272035,
"acc_norm": 0.615606936416185,
"acc_norm_stderr": 0.026189666966272035
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4659217877094972,
"acc_stderr": 0.016683615837486863,
"acc_norm": 0.4659217877094972,
"acc_norm_stderr": 0.016683615837486863
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.02782610930728369,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.02782610930728369
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6109324758842444,
"acc_stderr": 0.027690337536485372,
"acc_norm": 0.6109324758842444,
"acc_norm_stderr": 0.027690337536485372
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.595679012345679,
"acc_stderr": 0.027306625297327684,
"acc_norm": 0.595679012345679,
"acc_norm_stderr": 0.027306625297327684
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.029609912075594106,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.029609912075594106
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4276401564537158,
"acc_stderr": 0.012635799922765844,
"acc_norm": 0.4276401564537158,
"acc_norm_stderr": 0.012635799922765844
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5220588235294118,
"acc_stderr": 0.030343264224213514,
"acc_norm": 0.5220588235294118,
"acc_norm_stderr": 0.030343264224213514
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5718954248366013,
"acc_stderr": 0.020017629214213094,
"acc_norm": 0.5718954248366013,
"acc_norm_stderr": 0.020017629214213094
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.030862144921087555,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.030862144921087555
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.03096590312357303,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.03096590312357303
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3488372093023256,
"mc1_stderr": 0.016684419859986897,
"mc2": 0.5030405325722809,
"mc2_stderr": 0.015544005374161975
}
}
```
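The aggregated metrics above are also stored as parquet in the `results` configuration, so they can be inspected programmatically. A minimal sketch, assuming the `results` config and `latest` split defined in this card's metadata:
```python
from datasets import load_dataset

# "results" holds the aggregated metrics for each run; the "latest"
# split resolves to results_2023-09-12T10-14-13.361250.parquet.
results = load_dataset(
    "open-llm-leaderboard/details_Sao10K__Stheno-1.1-L2-13B",
    "results",
    split="latest",
)
print(results[0])  # aggregated acc / acc_norm / mc1 / mc2 values
```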
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
pvduy/evol_code_v2_sample | 2023-09-13T09:12:20.000Z | [
"region:us"
] | pvduy | null | null | null | 0 | 0 | Entry not found |
Jesal14/call_center_speech_dataset | 2023-09-12T10:34:58.000Z | [
"region:us"
] | Jesal14 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE1_17w-r4 | 2023-09-12T10:38:44.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE1_17w-r4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE1_17w-r4](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE1_17w-r4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE1_17w-r4\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T10:37:25.589822](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE1_17w-r4/blob/main/results_2023-09-12T10-37-25.589822.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5621132365544714,\n\
\ \"acc_stderr\": 0.034215487004865645,\n \"acc_norm\": 0.5663655682347757,\n\
\ \"acc_norm_stderr\": 0.034195799225212986,\n \"mc1\": 0.27050183598531213,\n\
\ \"mc1_stderr\": 0.015550778332842895,\n \"mc2\": 0.3964818897108659,\n\
\ \"mc2_stderr\": 0.014170133821585919\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5247440273037542,\n \"acc_stderr\": 0.014593487694937738,\n\
\ \"acc_norm\": 0.5674061433447098,\n \"acc_norm_stderr\": 0.014478005694182523\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6145190201155148,\n\
\ \"acc_stderr\": 0.004857140410776743,\n \"acc_norm\": 0.8227444732125074,\n\
\ \"acc_norm_stderr\": 0.0038110434120246658\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.039889037033362836,\n\
\ \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.039889037033362836\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5763888888888888,\n\
\ \"acc_stderr\": 0.04132125019723369,\n \"acc_norm\": 0.5763888888888888,\n\
\ \"acc_norm_stderr\": 0.04132125019723369\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.042663394431593935,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.042663394431593935\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.34656084656084657,\n \"acc_stderr\": 0.024508777521028424,\n \"\
acc_norm\": 0.34656084656084657,\n \"acc_norm_stderr\": 0.024508777521028424\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542126,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542126\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n\
\ \"acc_stderr\": 0.02672949906834996,\n \"acc_norm\": 0.6709677419354839,\n\
\ \"acc_norm_stderr\": 0.02672949906834996\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n\
\ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03681050869161551,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03681050869161551\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300992,\n \"\
acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300992\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5282051282051282,\n \"acc_stderr\": 0.02531063925493389,\n \
\ \"acc_norm\": 0.5282051282051282,\n \"acc_norm_stderr\": 0.02531063925493389\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608466,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608466\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.032104790510157764,\n\
\ \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.032104790510157764\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7614678899082569,\n \"acc_stderr\": 0.01827257581023187,\n \"\
acc_norm\": 0.7614678899082569,\n \"acc_norm_stderr\": 0.01827257581023187\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.03058759135160425,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.03058759135160425\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.04260735157644559,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.04260735157644559\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070415,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070415\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.023902325549560406,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.023902325549560406\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.014866821664709588,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.014866821664709588\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.638728323699422,\n \"acc_stderr\": 0.025862201852277906,\n\
\ \"acc_norm\": 0.638728323699422,\n \"acc_norm_stderr\": 0.025862201852277906\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30614525139664805,\n\
\ \"acc_stderr\": 0.015414494487903222,\n \"acc_norm\": 0.30614525139664805,\n\
\ \"acc_norm_stderr\": 0.015414494487903222\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.027780141207023344,\n\
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.027780141207023344\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n\
\ \"acc_stderr\": 0.027417996705630988,\n \"acc_norm\": 0.6302250803858521,\n\
\ \"acc_norm_stderr\": 0.027417996705630988\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6604938271604939,\n \"acc_stderr\": 0.026348564412011624,\n\
\ \"acc_norm\": 0.6604938271604939,\n \"acc_norm_stderr\": 0.026348564412011624\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370597,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370597\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41916558018252936,\n\
\ \"acc_stderr\": 0.012602244505788236,\n \"acc_norm\": 0.41916558018252936,\n\
\ \"acc_norm_stderr\": 0.012602244505788236\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5441176470588235,\n \"acc_stderr\": 0.030254372573976715,\n\
\ \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.030254372573976715\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5522875816993464,\n \"acc_stderr\": 0.020116925347422425,\n \
\ \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.020116925347422425\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5836734693877551,\n \"acc_stderr\": 0.03155782816556165,\n\
\ \"acc_norm\": 0.5836734693877551,\n \"acc_norm_stderr\": 0.03155782816556165\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7213930348258707,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.7213930348258707,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27050183598531213,\n\
\ \"mc1_stderr\": 0.015550778332842895,\n \"mc2\": 0.3964818897108659,\n\
\ \"mc2_stderr\": 0.014170133821585919\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE1_17w-r4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|arc:challenge|25_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hellaswag|10_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T10-37-25.589822.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T10-37-25.589822.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T10-37-25.589822.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T10-37-25.589822.parquet'
- config_name: results
data_files:
- split: 2023_09_12T10_37_25.589822
path:
- results_2023-09-12T10-37-25.589822.parquet
- split: latest
path:
- results_2023-09-12T10-37-25.589822.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE1_17w-r4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE1_17w-r4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE1_17w-r4](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE1_17w-r4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE1_17w-r4",
"harness_truthfulqa_mc_0",
split="train")
```
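The aggregated metrics can be loaded the same way through the "results" configuration, and a specific run can be pinned by its timestamped split. A minimal sketch, using the config and split names listed in this card's front matter:
```python
from datasets import load_dataset

# The "latest" split always points to the most recent run.
results = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE1_17w-r4",
                       "results",
                       split="latest")

# Or pin a specific run by its timestamped split name:
arc = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE1_17w-r4",
                   "harness_arc_challenge_25",
                   split="2023_09_12T10_37_25.589822")
```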
## Latest results
These are the [latest results from run 2023-09-12T10:37:25.589822](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE1_17w-r4/blob/main/results_2023-09-12T10-37-25.589822.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5621132365544714,
"acc_stderr": 0.034215487004865645,
"acc_norm": 0.5663655682347757,
"acc_norm_stderr": 0.034195799225212986,
"mc1": 0.27050183598531213,
"mc1_stderr": 0.015550778332842895,
"mc2": 0.3964818897108659,
"mc2_stderr": 0.014170133821585919
},
"harness|arc:challenge|25": {
"acc": 0.5247440273037542,
"acc_stderr": 0.014593487694937738,
"acc_norm": 0.5674061433447098,
"acc_norm_stderr": 0.014478005694182523
},
"harness|hellaswag|10": {
"acc": 0.6145190201155148,
"acc_stderr": 0.004857140410776743,
"acc_norm": 0.8227444732125074,
"acc_norm_stderr": 0.0038110434120246658
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.039889037033362836,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.039889037033362836
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5763888888888888,
"acc_stderr": 0.04132125019723369,
"acc_norm": 0.5763888888888888,
"acc_norm_stderr": 0.04132125019723369
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077636,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.042663394431593935,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.042663394431593935
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.34656084656084657,
"acc_stderr": 0.024508777521028424,
"acc_norm": 0.34656084656084657,
"acc_norm_stderr": 0.024508777521028424
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6709677419354839,
"acc_stderr": 0.02672949906834996,
"acc_norm": 0.6709677419354839,
"acc_norm_stderr": 0.02672949906834996
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03681050869161551,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03681050869161551
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7121212121212122,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.7121212121212122,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5282051282051282,
"acc_stderr": 0.02531063925493389,
"acc_norm": 0.5282051282051282,
"acc_norm_stderr": 0.02531063925493389
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608466,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608466
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5756302521008403,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.5756302521008403,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7614678899082569,
"acc_stderr": 0.01827257581023187,
"acc_norm": 0.7614678899082569,
"acc_norm_stderr": 0.01827257581023187
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.03058759135160425,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.03058759135160425
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.04260735157644559,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.04260735157644559
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070415,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070415
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6380368098159509,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.6380368098159509,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560406,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560406
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.014866821664709588,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.014866821664709588
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.638728323699422,
"acc_stderr": 0.025862201852277906,
"acc_norm": 0.638728323699422,
"acc_norm_stderr": 0.025862201852277906
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30614525139664805,
"acc_stderr": 0.015414494487903222,
"acc_norm": 0.30614525139664805,
"acc_norm_stderr": 0.015414494487903222
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.027780141207023344,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.027780141207023344
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6302250803858521,
"acc_stderr": 0.027417996705630988,
"acc_norm": 0.6302250803858521,
"acc_norm_stderr": 0.027417996705630988
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6604938271604939,
"acc_stderr": 0.026348564412011624,
"acc_norm": 0.6604938271604939,
"acc_norm_stderr": 0.026348564412011624
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370597,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370597
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41916558018252936,
"acc_stderr": 0.012602244505788236,
"acc_norm": 0.41916558018252936,
"acc_norm_stderr": 0.012602244505788236
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.030254372573976715,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.030254372573976715
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5522875816993464,
"acc_stderr": 0.020116925347422425,
"acc_norm": 0.5522875816993464,
"acc_norm_stderr": 0.020116925347422425
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5836734693877551,
"acc_stderr": 0.03155782816556165,
"acc_norm": 0.5836734693877551,
"acc_norm_stderr": 0.03155782816556165
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7213930348258707,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.7213930348258707,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27050183598531213,
"mc1_stderr": 0.015550778332842895,
"mc2": 0.3964818897108659,
"mc2_stderr": 0.014170133821585919
}
}
```
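The block above is valid JSON, so it can be parsed directly. A small hypothetical sketch (assuming the blob has been read into a string named `raw_results`) that ranks the MMLU subtasks by accuracy:
```python
import json

# raw_results: the JSON string printed above (hypothetical variable name).
results = json.loads(raw_results)

# Keep only the per-subject MMLU ("hendrycksTest") entries and sort by accuracy.
mmlu = {task: metrics["acc"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest")}
for task, acc in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)[:3]:
    print(f"{task}: {acc:.3f}")
```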
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_guardrail__llama-2-7b-guanaco-instruct-sharded | 2023-09-12T10:45:28.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of guardrail/llama-2-7b-guanaco-instruct-sharded
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [guardrail/llama-2-7b-guanaco-instruct-sharded](https://huggingface.co/guardrail/llama-2-7b-guanaco-instruct-sharded)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_guardrail__llama-2-7b-guanaco-instruct-sharded\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T10:44:14.063451](https://huggingface.co/datasets/open-llm-leaderboard/details_guardrail__llama-2-7b-guanaco-instruct-sharded/blob/main/results_2023-09-12T10-44-14.063451.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.46936353378881585,\n\
\ \"acc_stderr\": 0.035084611682527016,\n \"acc_norm\": 0.4731751498063089,\n\
\ \"acc_norm_stderr\": 0.03507021490987327,\n \"mc1\": 0.2876376988984088,\n\
\ \"mc1_stderr\": 0.015846315101394812,\n \"mc2\": 0.4393102241986315,\n\
\ \"mc2_stderr\": 0.015566601930350163\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5025597269624573,\n \"acc_stderr\": 0.014611199329843788,\n\
\ \"acc_norm\": 0.537542662116041,\n \"acc_norm_stderr\": 0.014570144495075581\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.59699263095001,\n \
\ \"acc_stderr\": 0.004894997736719052,\n \"acc_norm\": 0.7868950408285202,\n\
\ \"acc_norm_stderr\": 0.004086642984916037\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n\
\ \"acc_stderr\": 0.042561937679014075,\n \"acc_norm\": 0.4148148148148148,\n\
\ \"acc_norm_stderr\": 0.042561937679014075\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.040657710025626036,\n\
\ \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.040657710025626036\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.42,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5245283018867924,\n \"acc_stderr\": 0.030735822206205608,\n\
\ \"acc_norm\": 0.5245283018867924,\n \"acc_norm_stderr\": 0.030735822206205608\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5416666666666666,\n\
\ \"acc_stderr\": 0.041666666666666644,\n \"acc_norm\": 0.5416666666666666,\n\
\ \"acc_norm_stderr\": 0.041666666666666644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.37572254335260113,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.37572254335260113,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.03196758697835363,\n\
\ \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.03196758697835363\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n\
\ \"acc_stderr\": 0.044895393502707,\n \"acc_norm\": 0.3508771929824561,\n\
\ \"acc_norm_stderr\": 0.044895393502707\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2804232804232804,\n \"acc_stderr\": 0.02313528797432563,\n \"\
acc_norm\": 0.2804232804232804,\n \"acc_norm_stderr\": 0.02313528797432563\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.038932596106046734,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.038932596106046734\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5225806451612903,\n\
\ \"acc_stderr\": 0.02841498501970786,\n \"acc_norm\": 0.5225806451612903,\n\
\ \"acc_norm_stderr\": 0.02841498501970786\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3497536945812808,\n \"acc_stderr\": 0.03355400904969566,\n\
\ \"acc_norm\": 0.3497536945812808,\n \"acc_norm_stderr\": 0.03355400904969566\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5454545454545454,\n \"acc_stderr\": 0.038881769216741004,\n\
\ \"acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.038881769216741004\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5959595959595959,\n \"acc_stderr\": 0.03496130972056129,\n \"\
acc_norm\": 0.5959595959595959,\n \"acc_norm_stderr\": 0.03496130972056129\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6580310880829016,\n \"acc_stderr\": 0.03423465100104284,\n\
\ \"acc_norm\": 0.6580310880829016,\n \"acc_norm_stderr\": 0.03423465100104284\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4153846153846154,\n \"acc_stderr\": 0.024985354923102325,\n\
\ \"acc_norm\": 0.4153846153846154,\n \"acc_norm_stderr\": 0.024985354923102325\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712163,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712163\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42436974789915966,\n \"acc_stderr\": 0.032104790510157764,\n\
\ \"acc_norm\": 0.42436974789915966,\n \"acc_norm_stderr\": 0.032104790510157764\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6495412844036698,\n \"acc_stderr\": 0.02045607759982446,\n \"\
acc_norm\": 0.6495412844036698,\n \"acc_norm_stderr\": 0.02045607759982446\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.03214952147802749,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03214952147802749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236434,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236434\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.030685820596610795,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.030685820596610795\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5695067264573991,\n\
\ \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.5695067264573991,\n\
\ \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.48091603053435117,\n \"acc_stderr\": 0.04382094705550988,\n\
\ \"acc_norm\": 0.48091603053435117,\n \"acc_norm_stderr\": 0.04382094705550988\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5702479338842975,\n \"acc_stderr\": 0.04519082021319772,\n \"\
acc_norm\": 0.5702479338842975,\n \"acc_norm_stderr\": 0.04519082021319772\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5648148148148148,\n\
\ \"acc_stderr\": 0.04792898170907061,\n \"acc_norm\": 0.5648148148148148,\n\
\ \"acc_norm_stderr\": 0.04792898170907061\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5337423312883436,\n \"acc_stderr\": 0.039194155450484096,\n\
\ \"acc_norm\": 0.5337423312883436,\n \"acc_norm_stderr\": 0.039194155450484096\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280042,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280042\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7435897435897436,\n\
\ \"acc_stderr\": 0.02860595370200424,\n \"acc_norm\": 0.7435897435897436,\n\
\ \"acc_norm_stderr\": 0.02860595370200424\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6679438058748404,\n\
\ \"acc_stderr\": 0.01684117465529572,\n \"acc_norm\": 0.6679438058748404,\n\
\ \"acc_norm_stderr\": 0.01684117465529572\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5202312138728323,\n \"acc_stderr\": 0.026897049996382875,\n\
\ \"acc_norm\": 0.5202312138728323,\n \"acc_norm_stderr\": 0.026897049996382875\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.21564245810055865,\n\
\ \"acc_stderr\": 0.013754835975482351,\n \"acc_norm\": 0.21564245810055865,\n\
\ \"acc_norm_stderr\": 0.013754835975482351\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5261437908496732,\n \"acc_stderr\": 0.028590752958852387,\n\
\ \"acc_norm\": 0.5261437908496732,\n \"acc_norm_stderr\": 0.028590752958852387\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5787781350482315,\n\
\ \"acc_stderr\": 0.02804339985821063,\n \"acc_norm\": 0.5787781350482315,\n\
\ \"acc_norm_stderr\": 0.02804339985821063\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5709876543209876,\n \"acc_stderr\": 0.027538925613470863,\n\
\ \"acc_norm\": 0.5709876543209876,\n \"acc_norm_stderr\": 0.027538925613470863\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3546099290780142,\n \"acc_stderr\": 0.028538650028878638,\n \
\ \"acc_norm\": 0.3546099290780142,\n \"acc_norm_stderr\": 0.028538650028878638\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3376792698826597,\n\
\ \"acc_stderr\": 0.012078563777145562,\n \"acc_norm\": 0.3376792698826597,\n\
\ \"acc_norm_stderr\": 0.012078563777145562\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3639705882352941,\n \"acc_stderr\": 0.029227192460032025,\n\
\ \"acc_norm\": 0.3639705882352941,\n \"acc_norm_stderr\": 0.029227192460032025\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.47549019607843135,\n \"acc_stderr\": 0.020203517280261443,\n \
\ \"acc_norm\": 0.47549019607843135,\n \"acc_norm_stderr\": 0.020203517280261443\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04789131426105757,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04789131426105757\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4775510204081633,\n \"acc_stderr\": 0.03197694118713672,\n\
\ \"acc_norm\": 0.4775510204081633,\n \"acc_norm_stderr\": 0.03197694118713672\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.572139303482587,\n\
\ \"acc_stderr\": 0.03498541988407795,\n \"acc_norm\": 0.572139303482587,\n\
\ \"acc_norm_stderr\": 0.03498541988407795\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.035087719298245626,\n\
\ \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.035087719298245626\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2876376988984088,\n\
\ \"mc1_stderr\": 0.015846315101394812,\n \"mc2\": 0.4393102241986315,\n\
\ \"mc2_stderr\": 0.015566601930350163\n }\n}\n```"
repo_url: https://huggingface.co/guardrail/llama-2-7b-guanaco-instruct-sharded
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|arc:challenge|25_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hellaswag|10_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T10-44-14.063451.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T10-44-14.063451.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T10-44-14.063451.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T10-44-14.063451.parquet'
- config_name: results
data_files:
- split: 2023_09_12T10_44_14.063451
path:
- results_2023-09-12T10-44-14.063451.parquet
- split: latest
path:
- results_2023-09-12T10-44-14.063451.parquet
---
# Dataset Card for Evaluation run of guardrail/llama-2-7b-guanaco-instruct-sharded
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/guardrail/llama-2-7b-guanaco-instruct-sharded
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [guardrail/llama-2-7b-guanaco-instruct-sharded](https://huggingface.co/guardrail/llama-2-7b-guanaco-instruct-sharded) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_guardrail__llama-2-7b-guanaco-instruct-sharded",
"harness_truthfulqa_mc_0",
split="train")
```
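The same pattern works for any configuration listed in the metadata above. As a minimal sketch (the config and split names are taken directly from the `configs` section of this card), you can pin the `latest` split of a single MMLU task instead of relying on `train`:
```python
from datasets import load_dataset

# "latest" always resolves to the most recent timestamped run of this config.
details = load_dataset(
    "open-llm-leaderboard/details_guardrail__llama-2-7b-guanaco-instruct-sharded",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
print(details)
```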
## Latest results
These are the [latest results from run 2023-09-12T10:44:14.063451](https://huggingface.co/datasets/open-llm-leaderboard/details_guardrail__llama-2-7b-guanaco-instruct-sharded/blob/main/results_2023-09-12T10-44-14.063451.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" and "latest" splits for each eval):
```python
{
"all": {
"acc": 0.46936353378881585,
"acc_stderr": 0.035084611682527016,
"acc_norm": 0.4731751498063089,
"acc_norm_stderr": 0.03507021490987327,
"mc1": 0.2876376988984088,
"mc1_stderr": 0.015846315101394812,
"mc2": 0.4393102241986315,
"mc2_stderr": 0.015566601930350163
},
"harness|arc:challenge|25": {
"acc": 0.5025597269624573,
"acc_stderr": 0.014611199329843788,
"acc_norm": 0.537542662116041,
"acc_norm_stderr": 0.014570144495075581
},
"harness|hellaswag|10": {
"acc": 0.59699263095001,
"acc_stderr": 0.004894997736719052,
"acc_norm": 0.7868950408285202,
"acc_norm_stderr": 0.004086642984916037
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.042561937679014075,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.042561937679014075
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.48026315789473684,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.48026315789473684,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5245283018867924,
"acc_stderr": 0.030735822206205608,
"acc_norm": 0.5245283018867924,
"acc_norm_stderr": 0.030735822206205608
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.041666666666666644,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.041666666666666644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.37572254335260113,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.37572254335260113,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.03196758697835363,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.03196758697835363
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502707,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2804232804232804,
"acc_stderr": 0.02313528797432563,
"acc_norm": 0.2804232804232804,
"acc_norm_stderr": 0.02313528797432563
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.038932596106046734,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.038932596106046734
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5225806451612903,
"acc_stderr": 0.02841498501970786,
"acc_norm": 0.5225806451612903,
"acc_norm_stderr": 0.02841498501970786
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3497536945812808,
"acc_stderr": 0.03355400904969566,
"acc_norm": 0.3497536945812808,
"acc_norm_stderr": 0.03355400904969566
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.038881769216741004,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.038881769216741004
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5959595959595959,
"acc_stderr": 0.03496130972056129,
"acc_norm": 0.5959595959595959,
"acc_norm_stderr": 0.03496130972056129
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6580310880829016,
"acc_stderr": 0.03423465100104284,
"acc_norm": 0.6580310880829016,
"acc_norm_stderr": 0.03423465100104284
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4153846153846154,
"acc_stderr": 0.024985354923102325,
"acc_norm": 0.4153846153846154,
"acc_norm_stderr": 0.024985354923102325
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712163,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712163
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42436974789915966,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.42436974789915966,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6495412844036698,
"acc_stderr": 0.02045607759982446,
"acc_norm": 0.6495412844036698,
"acc_norm_stderr": 0.02045607759982446
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03214952147802749,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03214952147802749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03308611113236434,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03308611113236434
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.030685820596610795,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.030685820596610795
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5695067264573991,
"acc_stderr": 0.033231973029429394,
"acc_norm": 0.5695067264573991,
"acc_norm_stderr": 0.033231973029429394
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.48091603053435117,
"acc_stderr": 0.04382094705550988,
"acc_norm": 0.48091603053435117,
"acc_norm_stderr": 0.04382094705550988
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5702479338842975,
"acc_stderr": 0.04519082021319772,
"acc_norm": 0.5702479338842975,
"acc_norm_stderr": 0.04519082021319772
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.04792898170907061,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.04792898170907061
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5337423312883436,
"acc_stderr": 0.039194155450484096,
"acc_norm": 0.5337423312883436,
"acc_norm_stderr": 0.039194155450484096
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280042,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280042
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7435897435897436,
"acc_stderr": 0.02860595370200424,
"acc_norm": 0.7435897435897436,
"acc_norm_stderr": 0.02860595370200424
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6679438058748404,
"acc_stderr": 0.01684117465529572,
"acc_norm": 0.6679438058748404,
"acc_norm_stderr": 0.01684117465529572
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.026897049996382875,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.026897049996382875
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.21564245810055865,
"acc_stderr": 0.013754835975482351,
"acc_norm": 0.21564245810055865,
"acc_norm_stderr": 0.013754835975482351
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5261437908496732,
"acc_stderr": 0.028590752958852387,
"acc_norm": 0.5261437908496732,
"acc_norm_stderr": 0.028590752958852387
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5787781350482315,
"acc_stderr": 0.02804339985821063,
"acc_norm": 0.5787781350482315,
"acc_norm_stderr": 0.02804339985821063
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5709876543209876,
"acc_stderr": 0.027538925613470863,
"acc_norm": 0.5709876543209876,
"acc_norm_stderr": 0.027538925613470863
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3546099290780142,
"acc_stderr": 0.028538650028878638,
"acc_norm": 0.3546099290780142,
"acc_norm_stderr": 0.028538650028878638
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3376792698826597,
"acc_stderr": 0.012078563777145562,
"acc_norm": 0.3376792698826597,
"acc_norm_stderr": 0.012078563777145562
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3639705882352941,
"acc_stderr": 0.029227192460032025,
"acc_norm": 0.3639705882352941,
"acc_norm_stderr": 0.029227192460032025
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.47549019607843135,
"acc_stderr": 0.020203517280261443,
"acc_norm": 0.47549019607843135,
"acc_norm_stderr": 0.020203517280261443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5,
"acc_stderr": 0.04789131426105757,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04789131426105757
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4775510204081633,
"acc_stderr": 0.03197694118713672,
"acc_norm": 0.4775510204081633,
"acc_norm_stderr": 0.03197694118713672
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.572139303482587,
"acc_stderr": 0.03498541988407795,
"acc_norm": 0.572139303482587,
"acc_norm_stderr": 0.03498541988407795
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.035087719298245626,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.035087719298245626
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2876376988984088,
"mc1_stderr": 0.015846315101394812,
"mc2": 0.4393102241986315,
"mc2_stderr": 0.015566601930350163
}
}
```
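Rather than copy-pasting numbers out of the JSON above, the aggregated metrics can also be loaded programmatically from the "results" configuration. A minimal sketch follows; the exact record layout inside the results parquet is an assumption, so inspect the loaded row before relying on specific column names:
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics for each run;
# its "latest" split mirrors the JSON block shown above.
results = load_dataset(
    "open-llm-leaderboard/details_guardrail__llama-2-7b-guanaco-instruct-sharded",
    "results",
    split="latest",
)
print(results[0])  # assumed: one aggregated record per run
```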
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
malteee/TruckDet1 | 2023-09-12T13:44:59.000Z | [
"region:us"
] | malteee | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int64
- name: height
dtype: int64
- name: objects
struct:
- name: area
sequence: float64
- name: bbox
sequence:
sequence: float64
- name: category
sequence: int64
- name: id
sequence: int64
splits:
- name: train
num_bytes: 78780289.0
num_examples: 651
download_size: 78781526
dataset_size: 78780289.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "TruckDet"
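The feature schema declared in the metadata above is a standard object-detection layout: each image carries an `objects` struct with parallel `area`, `bbox`, `category`, and `id` sequences, one entry per annotation. A minimal sketch for loading and inspecting one example (field access follows the declared schema; nothing beyond it is assumed):
```python
from datasets import load_dataset

# Load the single "train" split declared in the dataset metadata (651 examples).
ds = load_dataset("malteee/TruckDet1", split="train")

example = ds[0]
print(example["image_id"], example["width"], example["height"])

# "objects" holds parallel sequences: one bbox/category pair per annotation.
for bbox, category in zip(example["objects"]["bbox"], example["objects"]["category"]):
    print(category, bbox)
```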
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
husmin/reportsQA | 2023-09-12T11:22:54.000Z | [
"region:us"
] | husmin | null | null | null | 0 | 0 | Entry not found |