---
pretty_name: Evaluation run of togethercomputer/Pythia-Chat-Base-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [togethercomputer/Pythia-Chat-Base-7B](https://huggingface.co/togethercomputer/Pythia-Chat-Base-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
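\nTo see which configurations are available, one option (a minimal sketch using the\
\ datasets library's get_dataset_config_names helper) is:\n```python\nfrom datasets\
\ import get_dataset_config_names\nconfigs = get_dataset_config_names(\"open-llm-leaderboard/details_togethercomputer__Pythia-Chat-Base-7B\"\
)\nprint(configs)\n```\n\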
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_togethercomputer__Pythia-Chat-Base-7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-19T16:40:02.088273](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__Pythia-Chat-Base-7B/blob/main/results_2023-07-19T16%3A40%3A02.088273.json)\
\ (note that there might be results for other tasks in the repository if successive\
\ evals didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2796926769457635,\n\
\ \"acc_stderr\": 0.03257368967680694,\n \"acc_norm\": 0.28353072501568105,\n\
\ \"acc_norm_stderr\": 0.03257390222310244,\n \"mc1\": 0.23011015911872704,\n\
\ \"mc1_stderr\": 0.014734557959807765,\n \"mc2\": 0.34628186450711146,\n\
\ \"mc2_stderr\": 0.014004393578326618\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3506825938566553,\n \"acc_stderr\": 0.013944635930726089,\n\
\ \"acc_norm\": 0.40017064846416384,\n \"acc_norm_stderr\": 0.014317197787809176\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5097590121489743,\n\
\ \"acc_stderr\": 0.0049888308841316295,\n \"acc_norm\": 0.6867157936666003,\n\
\ \"acc_norm_stderr\": 0.00462880925848353\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34074074074074073,\n\
\ \"acc_stderr\": 0.04094376269996793,\n \"acc_norm\": 0.34074074074074073,\n\
\ \"acc_norm_stderr\": 0.04094376269996793\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.29605263157894735,\n \"acc_stderr\": 0.03715062154998905,\n\
\ \"acc_norm\": 0.29605263157894735,\n \"acc_norm_stderr\": 0.03715062154998905\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.31,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2792452830188679,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.2792452830188679,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n\
\ \"acc_stderr\": 0.03716177437566016,\n \"acc_norm\": 0.2708333333333333,\n\
\ \"acc_norm_stderr\": 0.03716177437566016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\"\
: 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.20851063829787234,\n \"acc_stderr\": 0.026556982117838742,\n\
\ \"acc_norm\": 0.20851063829787234,\n \"acc_norm_stderr\": 0.026556982117838742\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537316,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537316\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.03695183311650232,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.03695183311650232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24603174603174602,\n \"acc_stderr\": 0.022182037202948365,\n \"\
acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.022182037202948365\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235173,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.24838709677419354,\n \"acc_stderr\": 0.024580028921481006,\n \"\
acc_norm\": 0.24838709677419354,\n \"acc_norm_stderr\": 0.024580028921481006\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.23645320197044334,\n \"acc_stderr\": 0.029896114291733545,\n \"\
acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.029896114291733545\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\
: 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.296969696969697,\n \"acc_stderr\": 0.03567969772268048,\n\
\ \"acc_norm\": 0.296969696969697,\n \"acc_norm_stderr\": 0.03567969772268048\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2474747474747475,\n \"acc_stderr\": 0.030746300742124495,\n \"\
acc_norm\": 0.2474747474747475,\n \"acc_norm_stderr\": 0.030746300742124495\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.2538860103626943,\n \"acc_stderr\": 0.03141024780565318,\n\
\ \"acc_norm\": 0.2538860103626943,\n \"acc_norm_stderr\": 0.03141024780565318\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2282051282051282,\n \"acc_stderr\": 0.021278393863586282,\n\
\ \"acc_norm\": 0.2282051282051282,\n \"acc_norm_stderr\": 0.021278393863586282\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.026202766534652148,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.026202766534652148\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2605042016806723,\n \"acc_stderr\": 0.02851025151234193,\n \
\ \"acc_norm\": 0.2605042016806723,\n \"acc_norm_stderr\": 0.02851025151234193\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23841059602649006,\n \"acc_stderr\": 0.03479185572599661,\n \"\
acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.03479185572599661\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24220183486238533,\n \"acc_stderr\": 0.01836817630659862,\n \"\
acc_norm\": 0.24220183486238533,\n \"acc_norm_stderr\": 0.01836817630659862\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.21296296296296297,\n \"acc_stderr\": 0.027920963147993652,\n \"\
acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.027920963147993652\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.29411764705882354,\n \"acc_stderr\": 0.03198001660115072,\n \"\
acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.03198001660115072\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.20675105485232068,\n \"acc_stderr\": 0.026361651668389104,\n \
\ \"acc_norm\": 0.20675105485232068,\n \"acc_norm_stderr\": 0.026361651668389104\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.21973094170403587,\n\
\ \"acc_stderr\": 0.0277901770643836,\n \"acc_norm\": 0.21973094170403587,\n\
\ \"acc_norm_stderr\": 0.0277901770643836\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.35537190082644626,\n \"acc_stderr\": 0.04369236326573982,\n \"\
acc_norm\": 0.35537190082644626,\n \"acc_norm_stderr\": 0.04369236326573982\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3148148148148148,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.3148148148148148,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.035590395316173425,\n\
\ \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.035590395316173425\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.20388349514563106,\n \"acc_stderr\": 0.03989139859531769,\n\
\ \"acc_norm\": 0.20388349514563106,\n \"acc_norm_stderr\": 0.03989139859531769\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3162393162393162,\n\
\ \"acc_stderr\": 0.03046365674734024,\n \"acc_norm\": 0.3162393162393162,\n\
\ \"acc_norm_stderr\": 0.03046365674734024\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.015671006009339565,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.015671006009339565\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.29190751445086704,\n \"acc_stderr\": 0.02447699407624732,\n\
\ \"acc_norm\": 0.29190751445086704,\n \"acc_norm_stderr\": 0.02447699407624732\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2875816993464052,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.2875816993464052,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3022508038585209,\n\
\ \"acc_stderr\": 0.02608270069539965,\n \"acc_norm\": 0.3022508038585209,\n\
\ \"acc_norm_stderr\": 0.02608270069539965\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.27469135802469136,\n \"acc_stderr\": 0.024836057868294674,\n\
\ \"acc_norm\": 0.27469135802469136,\n \"acc_norm_stderr\": 0.024836057868294674\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2730496453900709,\n \"acc_stderr\": 0.02657786094330786,\n \
\ \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.02657786094330786\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27249022164276404,\n\
\ \"acc_stderr\": 0.011371658294311514,\n \"acc_norm\": 0.27249022164276404,\n\
\ \"acc_norm_stderr\": 0.011371658294311514\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.16176470588235295,\n \"acc_stderr\": 0.022368672562886754,\n\
\ \"acc_norm\": 0.16176470588235295,\n \"acc_norm_stderr\": 0.022368672562886754\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3022875816993464,\n \"acc_stderr\": 0.018579232711113884,\n \
\ \"acc_norm\": 0.3022875816993464,\n \"acc_norm_stderr\": 0.018579232711113884\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.23636363636363636,\n\
\ \"acc_stderr\": 0.04069306319721376,\n \"acc_norm\": 0.23636363636363636,\n\
\ \"acc_norm_stderr\": 0.04069306319721376\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3224489795918367,\n \"acc_stderr\": 0.029923100563683906,\n\
\ \"acc_norm\": 0.3224489795918367,\n \"acc_norm_stderr\": 0.029923100563683906\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409214,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409214\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25903614457831325,\n\
\ \"acc_stderr\": 0.03410646614071856,\n \"acc_norm\": 0.25903614457831325,\n\
\ \"acc_norm_stderr\": 0.03410646614071856\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.03508771929824565,\n\
\ \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.03508771929824565\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23011015911872704,\n\
\ \"mc1_stderr\": 0.014734557959807765,\n \"mc2\": 0.34628186450711146,\n\
\ \"mc2_stderr\": 0.014004393578326618\n }\n}\n```"
repo_url: https://huggingface.co/togethercomputer/Pythia-Chat-Base-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:40:02.088273.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:40:02.088273.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:40:02.088273.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:40:02.088273.parquet'
- config_name: results
data_files:
- split: 2023_07_19T16_40_02.088273
path:
- results_2023-07-19T16:40:02.088273.parquet
- split: latest
path:
- results_2023-07-19T16:40:02.088273.parquet
---
# Dataset Card for Evaluation run of togethercomputer/Pythia-Chat-Base-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/togethercomputer/Pythia-Chat-Base-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [togethercomputer/Pythia-Chat-Base-7B](https://huggingface.co/togethercomputer/Pythia-Chat-Base-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_togethercomputer__Pythia-Chat-Base-7B",
"harness_truthfulqa_mc_0",
split="train")
```
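Each configuration name used above follows a mechanical naming rule derived from the harness task identifier and the few-shot count. A minimal sketch of that rule, inferred from the config list in this card (the helper function itself is hypothetical, not part of the `datasets` API):

```python
def harness_config_name(task: str, n_shot: int) -> str:
    """Map a harness task id to the dataset config name used in this card.

    The harness separators ":" and "-" are replaced by underscores and the
    few-shot count is appended, e.g. "hendrycksTest-virology" with 5 shots
    becomes "harness_hendrycksTest_virology_5".
    """
    return "harness_" + task.replace(":", "_").replace("-", "_") + f"_{n_shot}"

print(harness_config_name("truthfulqa:mc", 0))           # → harness_truthfulqa_mc_0
print(harness_config_name("hendrycksTest-virology", 5))  # → harness_hendrycksTest_virology_5
```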
## Latest results
These are the [latest results from run 2023-07-19T16:40:02.088273](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__Pythia-Chat-Base-7B/blob/main/results_2023-07-19T16%3A40%3A02.088273.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find the results for each eval in its "latest" split):
```python
{
"all": {
"acc": 0.2796926769457635,
"acc_stderr": 0.03257368967680694,
"acc_norm": 0.28353072501568105,
"acc_norm_stderr": 0.03257390222310244,
"mc1": 0.23011015911872704,
"mc1_stderr": 0.014734557959807765,
"mc2": 0.34628186450711146,
"mc2_stderr": 0.014004393578326618
},
"harness|arc:challenge|25": {
"acc": 0.3506825938566553,
"acc_stderr": 0.013944635930726089,
"acc_norm": 0.40017064846416384,
"acc_norm_stderr": 0.014317197787809176
},
"harness|hellaswag|10": {
"acc": 0.5097590121489743,
"acc_stderr": 0.0049888308841316295,
"acc_norm": 0.6867157936666003,
"acc_norm_stderr": 0.00462880925848353
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.04094376269996793,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.04094376269996793
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.29605263157894735,
"acc_stderr": 0.03715062154998905,
"acc_norm": 0.29605263157894735,
"acc_norm_stderr": 0.03715062154998905
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2792452830188679,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.2792452830188679,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.03716177437566016,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.03716177437566016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20851063829787234,
"acc_stderr": 0.026556982117838742,
"acc_norm": 0.20851063829787234,
"acc_norm_stderr": 0.026556982117838742
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537316,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537316
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.03695183311650232,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.03695183311650232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.022182037202948365,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.022182037202948365
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235173,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24838709677419354,
"acc_stderr": 0.024580028921481006,
"acc_norm": 0.24838709677419354,
"acc_norm_stderr": 0.024580028921481006
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.029896114291733545,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.029896114291733545
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.296969696969697,
"acc_stderr": 0.03567969772268048,
"acc_norm": 0.296969696969697,
"acc_norm_stderr": 0.03567969772268048
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2474747474747475,
"acc_stderr": 0.030746300742124495,
"acc_norm": 0.2474747474747475,
"acc_norm_stderr": 0.030746300742124495
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2538860103626943,
"acc_stderr": 0.03141024780565318,
"acc_norm": 0.2538860103626943,
"acc_norm_stderr": 0.03141024780565318
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2282051282051282,
"acc_stderr": 0.021278393863586282,
"acc_norm": 0.2282051282051282,
"acc_norm_stderr": 0.021278393863586282
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.026202766534652148,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.026202766534652148
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2605042016806723,
"acc_stderr": 0.02851025151234193,
"acc_norm": 0.2605042016806723,
"acc_norm_stderr": 0.02851025151234193
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.03479185572599661,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.03479185572599661
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24220183486238533,
"acc_stderr": 0.01836817630659862,
"acc_norm": 0.24220183486238533,
"acc_norm_stderr": 0.01836817630659862
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.027920963147993652,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.027920963147993652
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.03198001660115072,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.03198001660115072
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.20675105485232068,
"acc_stderr": 0.026361651668389104,
"acc_norm": 0.20675105485232068,
"acc_norm_stderr": 0.026361651668389104
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.21973094170403587,
"acc_stderr": 0.0277901770643836,
"acc_norm": 0.21973094170403587,
"acc_norm_stderr": 0.0277901770643836
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.35537190082644626,
"acc_stderr": 0.04369236326573982,
"acc_norm": 0.35537190082644626,
"acc_norm_stderr": 0.04369236326573982
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2883435582822086,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.2883435582822086,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.20388349514563106,
"acc_stderr": 0.03989139859531769,
"acc_norm": 0.20388349514563106,
"acc_norm_stderr": 0.03989139859531769
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3162393162393162,
"acc_stderr": 0.03046365674734024,
"acc_norm": 0.3162393162393162,
"acc_norm_stderr": 0.03046365674734024
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.015671006009339565,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.015671006009339565
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.29190751445086704,
"acc_stderr": 0.02447699407624732,
"acc_norm": 0.29190751445086704,
"acc_norm_stderr": 0.02447699407624732
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2875816993464052,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.2875816993464052,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3022508038585209,
"acc_stderr": 0.02608270069539965,
"acc_norm": 0.3022508038585209,
"acc_norm_stderr": 0.02608270069539965
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.27469135802469136,
"acc_stderr": 0.024836057868294674,
"acc_norm": 0.27469135802469136,
"acc_norm_stderr": 0.024836057868294674
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.02657786094330786,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.02657786094330786
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27249022164276404,
"acc_stderr": 0.011371658294311514,
"acc_norm": 0.27249022164276404,
"acc_norm_stderr": 0.011371658294311514
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.16176470588235295,
"acc_stderr": 0.022368672562886754,
"acc_norm": 0.16176470588235295,
"acc_norm_stderr": 0.022368672562886754
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3022875816993464,
"acc_stderr": 0.018579232711113884,
"acc_norm": 0.3022875816993464,
"acc_norm_stderr": 0.018579232711113884
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.04069306319721376,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.04069306319721376
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3224489795918367,
"acc_stderr": 0.029923100563683906,
"acc_norm": 0.3224489795918367,
"acc_norm_stderr": 0.029923100563683906
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409214,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409214
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25903614457831325,
"acc_stderr": 0.03410646614071856,
"acc_norm": 0.25903614457831325,
"acc_norm_stderr": 0.03410646614071856
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.03508771929824565,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.03508771929824565
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23011015911872704,
"mc1_stderr": 0.014734557959807765,
"mc2": 0.34628186450711146,
"mc2_stderr": 0.014004393578326618
}
}
```
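The top-level "all" block aggregates the per-task scores. As a minimal sketch (assuming an unweighted mean, which may differ from the leaderboard's exact aggregation), the same kind of average can be recomputed from a few of the per-task entries above:

```python
# Per-task "acc" values copied from three of the entries above; the full
# aggregate would average over every evaluated task, not just these three.
task_acc = {
    "harness|arc:challenge|25": 0.3506825938566553,
    "harness|hellaswag|10": 0.5097590121489743,
    "harness|hendrycksTest-abstract_algebra|5": 0.37,
}

mean_acc = sum(task_acc.values()) / len(task_acc)
print(f"mean acc over {len(task_acc)} tasks: {mean_acc:.4f}")  # → 0.4101
```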
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.35751295336787564,\n \"acc_stderr\": 0.03458816042181005,\n\
\ \"acc_norm\": 0.35751295336787564,\n \"acc_norm_stderr\": 0.03458816042181005\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.29743589743589743,\n \"acc_stderr\": 0.02317740813146594,\n\
\ \"acc_norm\": 0.29743589743589743,\n \"acc_norm_stderr\": 0.02317740813146594\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.02659393910184407,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.02659393910184407\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176896,\n\
\ \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176896\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.25137614678899084,\n \"acc_stderr\": 0.018599206360287415,\n \"\
acc_norm\": 0.25137614678899084,\n \"acc_norm_stderr\": 0.018599206360287415\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.24537037037037038,\n \"acc_stderr\": 0.029346665094372937,\n \"\
acc_norm\": 0.24537037037037038,\n \"acc_norm_stderr\": 0.029346665094372937\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.3480392156862745,\n \"acc_stderr\": 0.03343311240488419,\n \"\
acc_norm\": 0.3480392156862745,\n \"acc_norm_stderr\": 0.03343311240488419\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598046,\n \
\ \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598046\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.242152466367713,\n\
\ \"acc_stderr\": 0.028751392398694755,\n \"acc_norm\": 0.242152466367713,\n\
\ \"acc_norm_stderr\": 0.028751392398694755\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3969465648854962,\n \"acc_stderr\": 0.04291135671009224,\n\
\ \"acc_norm\": 0.3969465648854962,\n \"acc_norm_stderr\": 0.04291135671009224\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4214876033057851,\n \"acc_stderr\": 0.04507732278775094,\n \"\
acc_norm\": 0.4214876033057851,\n \"acc_norm_stderr\": 0.04507732278775094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2037037037037037,\n\
\ \"acc_stderr\": 0.03893542518824848,\n \"acc_norm\": 0.2037037037037037,\n\
\ \"acc_norm_stderr\": 0.03893542518824848\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3312883435582822,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.3312883435582822,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3106796116504854,\n \"acc_stderr\": 0.045821241601615506,\n\
\ \"acc_norm\": 0.3106796116504854,\n \"acc_norm_stderr\": 0.045821241601615506\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.27350427350427353,\n\
\ \"acc_stderr\": 0.02920254015343117,\n \"acc_norm\": 0.27350427350427353,\n\
\ \"acc_norm_stderr\": 0.02920254015343117\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.3231162196679438,\n\
\ \"acc_stderr\": 0.016723726512343048,\n \"acc_norm\": 0.3231162196679438,\n\
\ \"acc_norm_stderr\": 0.016723726512343048\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2861271676300578,\n \"acc_stderr\": 0.024332146779134128,\n\
\ \"acc_norm\": 0.2861271676300578,\n \"acc_norm_stderr\": 0.024332146779134128\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.026787453111906535,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.026787453111906535\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3247588424437299,\n\
\ \"acc_stderr\": 0.02659678228769705,\n \"acc_norm\": 0.3247588424437299,\n\
\ \"acc_norm_stderr\": 0.02659678228769705\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.33024691358024694,\n \"acc_stderr\": 0.026168298456732846,\n\
\ \"acc_norm\": 0.33024691358024694,\n \"acc_norm_stderr\": 0.026168298456732846\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.26595744680851063,\n \"acc_stderr\": 0.02635806569888059,\n \
\ \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.02635806569888059\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2848761408083442,\n\
\ \"acc_stderr\": 0.011527830846369004,\n \"acc_norm\": 0.2848761408083442,\n\
\ \"acc_norm_stderr\": 0.011527830846369004\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.023157468308559352,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.023157468308559352\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2679738562091503,\n \"acc_stderr\": 0.017917974069594726,\n \
\ \"acc_norm\": 0.2679738562091503,\n \"acc_norm_stderr\": 0.017917974069594726\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.23636363636363636,\n\
\ \"acc_stderr\": 0.040693063197213754,\n \"acc_norm\": 0.23636363636363636,\n\
\ \"acc_norm_stderr\": 0.040693063197213754\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.35918367346938773,\n \"acc_stderr\": 0.030713560455108493,\n\
\ \"acc_norm\": 0.35918367346938773,\n \"acc_norm_stderr\": 0.030713560455108493\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.25870646766169153,\n\
\ \"acc_stderr\": 0.030965903123573005,\n \"acc_norm\": 0.25870646766169153,\n\
\ \"acc_norm_stderr\": 0.030965903123573005\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n\
\ \"acc_stderr\": 0.03647168523683227,\n \"acc_norm\": 0.3253012048192771,\n\
\ \"acc_norm_stderr\": 0.03647168523683227\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21664626682986537,\n\
\ \"mc1_stderr\": 0.014421468452506985,\n \"mc2\": 0.34509306722168853,\n\
\ \"mc2_stderr\": 0.014157674584376619\n }\n}\n```"
repo_url: https://huggingface.co/togethercomputer/Pythia-Chat-Base-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|arc:challenge|25_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hellaswag|10_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:40:44.259947.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:40:44.259947.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T21:40:44.259947.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T21:40:44.259947.parquet'
- config_name: results
data_files:
- split: 2023_07_19T21_40_44.259947
path:
- results_2023-07-19T21:40:44.259947.parquet
- split: latest
path:
- results_2023-07-19T21:40:44.259947.parquet
---
# Dataset Card for Evaluation run of togethercomputer/GPT-NeoXT-Chat-Base-20B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/togethercomputer/GPT-NeoXT-Chat-Base-20B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [togethercomputer/GPT-NeoXT-Chat-Base-20B](https://huggingface.co/togethercomputer/GPT-NeoXT-Chat-Base-20B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_togethercomputer__GPT-NeoXT-Chat-Base-20B",
"harness_truthfulqa_mc_0",
split="train")
```
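The timestamped split names listed in the configuration above follow a simple convention: the ISO timestamp of the run with its dashes and colons replaced by underscores. A minimal sketch of that mapping (the helper name is illustrative, not part of the dataset):

```python
def to_split_name(timestamp: str) -> str:
    """Map a run timestamp to the split name used in this dataset, e.g.
    "2023-07-19T21:40:44.259947" -> "2023_07_19T21_40_44.259947"."""
    return timestamp.replace("-", "_").replace(":", "_")

print(to_split_name("2023-07-19T21:40:44.259947"))
# -> 2023_07_19T21_40_44.259947
```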
## Latest results
These are the [latest results from run 2023-07-19T21:40:44.259947](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__GPT-NeoXT-Chat-Base-20B/blob/main/results_2023-07-19T21%3A40%3A44.259947.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in its configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.30557516894558684,
"acc_stderr": 0.03330835238928597,
"acc_norm": 0.3093332728020259,
"acc_norm_stderr": 0.03330031981226186,
"mc1": 0.21664626682986537,
"mc1_stderr": 0.014421468452506985,
"mc2": 0.34509306722168853,
"mc2_stderr": 0.014157674584376619
},
"harness|arc:challenge|25": {
"acc": 0.42406143344709896,
"acc_stderr": 0.014441889627464398,
"acc_norm": 0.4564846416382253,
"acc_norm_stderr": 0.014555949760496437
},
"harness|hellaswag|10": {
"acc": 0.5509858593905597,
"acc_stderr": 0.004963771168672083,
"acc_norm": 0.7402907787293368,
"acc_norm_stderr": 0.0043757889912168476
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3618421052631579,
"acc_stderr": 0.03910525752849726,
"acc_norm": 0.3618421052631579,
"acc_norm_stderr": 0.03910525752849726
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.30943396226415093,
"acc_stderr": 0.028450154794118627,
"acc_norm": 0.30943396226415093,
"acc_norm_stderr": 0.028450154794118627
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3194444444444444,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.3194444444444444,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.03345036916788992,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.03345036916788992
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.23829787234042554,
"acc_stderr": 0.02785125297388979,
"acc_norm": 0.23829787234042554,
"acc_norm_stderr": 0.02785125297388979
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3793103448275862,
"acc_stderr": 0.04043461861916747,
"acc_norm": 0.3793103448275862,
"acc_norm_stderr": 0.04043461861916747
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.022860838309232072,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.022860838309232072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235173,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3258064516129032,
"acc_stderr": 0.026662010578567104,
"acc_norm": 0.3258064516129032,
"acc_norm_stderr": 0.026662010578567104
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.031270907132976984,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.031270907132976984
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2787878787878788,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.2787878787878788,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3787878787878788,
"acc_stderr": 0.03456088731993747,
"acc_norm": 0.3787878787878788,
"acc_norm_stderr": 0.03456088731993747
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.35751295336787564,
"acc_stderr": 0.03458816042181005,
"acc_norm": 0.35751295336787564,
"acc_norm_stderr": 0.03458816042181005
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.29743589743589743,
"acc_stderr": 0.02317740813146594,
"acc_norm": 0.29743589743589743,
"acc_norm_stderr": 0.02317740813146594
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.02659393910184407,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.02659393910184407
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176896,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176896
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.25137614678899084,
"acc_stderr": 0.018599206360287415,
"acc_norm": 0.25137614678899084,
"acc_norm_stderr": 0.018599206360287415
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.24537037037037038,
"acc_stderr": 0.029346665094372937,
"acc_norm": 0.24537037037037038,
"acc_norm_stderr": 0.029346665094372937
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3480392156862745,
"acc_stderr": 0.03343311240488419,
"acc_norm": 0.3480392156862745,
"acc_norm_stderr": 0.03343311240488419
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598046,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598046
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.242152466367713,
"acc_stderr": 0.028751392398694755,
"acc_norm": 0.242152466367713,
"acc_norm_stderr": 0.028751392398694755
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3969465648854962,
"acc_stderr": 0.04291135671009224,
"acc_norm": 0.3969465648854962,
"acc_norm_stderr": 0.04291135671009224
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4214876033057851,
"acc_stderr": 0.04507732278775094,
"acc_norm": 0.4214876033057851,
"acc_norm_stderr": 0.04507732278775094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.03893542518824848,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.03893542518824848
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3312883435582822,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.3312883435582822,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.3106796116504854,
"acc_stderr": 0.045821241601615506,
"acc_norm": 0.3106796116504854,
"acc_norm_stderr": 0.045821241601615506
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.27350427350427353,
"acc_stderr": 0.02920254015343117,
"acc_norm": 0.27350427350427353,
"acc_norm_stderr": 0.02920254015343117
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.3231162196679438,
"acc_stderr": 0.016723726512343048,
"acc_norm": 0.3231162196679438,
"acc_norm_stderr": 0.016723726512343048
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2861271676300578,
"acc_stderr": 0.024332146779134128,
"acc_norm": 0.2861271676300578,
"acc_norm_stderr": 0.024332146779134128
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.026787453111906535,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.026787453111906535
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3247588424437299,
"acc_stderr": 0.02659678228769705,
"acc_norm": 0.3247588424437299,
"acc_norm_stderr": 0.02659678228769705
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.33024691358024694,
"acc_stderr": 0.026168298456732846,
"acc_norm": 0.33024691358024694,
"acc_norm_stderr": 0.026168298456732846
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.02635806569888059,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.02635806569888059
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2848761408083442,
"acc_stderr": 0.011527830846369004,
"acc_norm": 0.2848761408083442,
"acc_norm_stderr": 0.011527830846369004
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.023157468308559352,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.023157468308559352
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2679738562091503,
"acc_stderr": 0.017917974069594726,
"acc_norm": 0.2679738562091503,
"acc_norm_stderr": 0.017917974069594726
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.040693063197213754,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.040693063197213754
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.35918367346938773,
"acc_stderr": 0.030713560455108493,
"acc_norm": 0.35918367346938773,
"acc_norm_stderr": 0.030713560455108493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.25870646766169153,
"acc_stderr": 0.030965903123573005,
"acc_norm": 0.25870646766169153,
"acc_norm_stderr": 0.030965903123573005
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3253012048192771,
"acc_stderr": 0.03647168523683227,
"acc_norm": 0.3253012048192771,
"acc_norm_stderr": 0.03647168523683227
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21664626682986537,
"mc1_stderr": 0.014421468452506985,
"mc2": 0.34509306722168853,
"mc2_stderr": 0.014157674584376619
}
}
```
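The "all" block above aggregates the per-task scores. Assuming it is a plain mean over tasks (an assumption for illustration, not something this card confirms), the relationship can be sketched on a small hypothetical subset of the results dict:

```python
# Hypothetical three-task subset of the results shown above; the full run has 61 tasks.
results = {
    "harness|arc:challenge|25": {"acc": 0.42406143344709896},
    "harness|hellaswag|10": {"acc": 0.5509858593905597},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.32},
}

# Mean accuracy over the tasks in this subset.
mean_acc = sum(task["acc"] for task in results.values()) / len(results)
print(round(mean_acc, 4))
# -> 0.4317
```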
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Chat-3B-v1 | 2023-08-27T12:37:21.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of togethercomputer/RedPajama-INCITE-Chat-3B-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [togethercomputer/RedPajama-INCITE-Chat-3B-v1](https://huggingface.co/togethercomputer/RedPajama-INCITE-Chat-3B-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Chat-3B-v1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-19T15:21:48.380977](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Chat-3B-v1/blob/main/results_2023-07-19T15%3A21%3A48.380977.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26844377474990455,\n\
\ \"acc_stderr\": 0.03198093633874286,\n \"acc_norm\": 0.2721408115500613,\n\
\ \"acc_norm_stderr\": 0.03197951774489613,\n \"mc1\": 0.21052631578947367,\n\
\ \"mc1_stderr\": 0.014271740645964192,\n \"mc2\": 0.34443339278943025,\n\
\ \"mc2_stderr\": 0.01347443593611675\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3856655290102389,\n \"acc_stderr\": 0.014224250973257168,\n\
\ \"acc_norm\": 0.4283276450511945,\n \"acc_norm_stderr\": 0.014460496367599022\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5006970722963553,\n\
\ \"acc_stderr\": 0.00498977656227611,\n \"acc_norm\": 0.6761601274646485,\n\
\ \"acc_norm_stderr\": 0.0046698341309770654\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.03633384414073465,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.03633384414073465\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.29605263157894735,\n \"acc_stderr\": 0.03715062154998905,\n\
\ \"acc_norm\": 0.29605263157894735,\n \"acc_norm_stderr\": 0.03715062154998905\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n\
\ \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n \
\ \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.30566037735849055,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.30566037735849055,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.037455547914624576,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.037455547914624576\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.23829787234042554,\n \"acc_stderr\": 0.027851252973889774,\n\
\ \"acc_norm\": 0.23829787234042554,\n \"acc_norm_stderr\": 0.027851252973889774\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2896551724137931,\n \"acc_stderr\": 0.03780019230438015,\n\
\ \"acc_norm\": 0.2896551724137931,\n \"acc_norm_stderr\": 0.03780019230438015\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577656,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577656\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\
\ \"acc_stderr\": 0.03764950879790605,\n \"acc_norm\": 0.23015873015873015,\n\
\ \"acc_norm_stderr\": 0.03764950879790605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25806451612903225,\n\
\ \"acc_stderr\": 0.024892469172462836,\n \"acc_norm\": 0.25806451612903225,\n\
\ \"acc_norm_stderr\": 0.024892469172462836\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.23645320197044334,\n \"acc_stderr\": 0.02989611429173355,\n\
\ \"acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.02989611429173355\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.033175059300091805,\n\
\ \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.033175059300091805\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3838383838383838,\n \"acc_stderr\": 0.03464881675016338,\n \"\
acc_norm\": 0.3838383838383838,\n \"acc_norm_stderr\": 0.03464881675016338\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.25906735751295334,\n \"acc_stderr\": 0.031618779179354094,\n\
\ \"acc_norm\": 0.25906735751295334,\n \"acc_norm_stderr\": 0.031618779179354094\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.23846153846153847,\n \"acc_stderr\": 0.02160629449464773,\n\
\ \"acc_norm\": 0.23846153846153847,\n \"acc_norm_stderr\": 0.02160629449464773\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507384,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507384\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3229357798165138,\n \"acc_stderr\": 0.02004811592341533,\n \"\
acc_norm\": 0.3229357798165138,\n \"acc_norm_stderr\": 0.02004811592341533\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.029886910547626964,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.029886910547626964\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.28921568627450983,\n \"acc_stderr\": 0.03182231867647553,\n \"\
acc_norm\": 0.28921568627450983,\n \"acc_norm_stderr\": 0.03182231867647553\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.13452914798206278,\n\
\ \"acc_stderr\": 0.022901183761575593,\n \"acc_norm\": 0.13452914798206278,\n\
\ \"acc_norm_stderr\": 0.022901183761575593\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3884297520661157,\n \"acc_stderr\": 0.04449270350068382,\n \"\
acc_norm\": 0.3884297520661157,\n \"acc_norm_stderr\": 0.04449270350068382\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.19642857142857142,\n\
\ \"acc_stderr\": 0.037709700493470194,\n \"acc_norm\": 0.19642857142857142,\n\
\ \"acc_norm_stderr\": 0.037709700493470194\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2815533980582524,\n \"acc_stderr\": 0.0445325483632647,\n\
\ \"acc_norm\": 0.2815533980582524,\n \"acc_norm_stderr\": 0.0445325483632647\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24786324786324787,\n\
\ \"acc_stderr\": 0.0282863240755644,\n \"acc_norm\": 0.24786324786324787,\n\
\ \"acc_norm_stderr\": 0.0282863240755644\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24776500638569604,\n\
\ \"acc_stderr\": 0.01543808308056897,\n \"acc_norm\": 0.24776500638569604,\n\
\ \"acc_norm_stderr\": 0.01543808308056897\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.28034682080924855,\n \"acc_stderr\": 0.024182427496577605,\n\
\ \"acc_norm\": 0.28034682080924855,\n \"acc_norm_stderr\": 0.024182427496577605\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2670391061452514,\n\
\ \"acc_stderr\": 0.014796502622562551,\n \"acc_norm\": 0.2670391061452514,\n\
\ \"acc_norm_stderr\": 0.014796502622562551\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2679738562091503,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.2679738562091503,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26366559485530544,\n\
\ \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.26366559485530544,\n\
\ \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.29012345679012347,\n \"acc_stderr\": 0.025251173936495022,\n\
\ \"acc_norm\": 0.29012345679012347,\n \"acc_norm_stderr\": 0.025251173936495022\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.26595744680851063,\n \"acc_stderr\": 0.02635806569888059,\n \
\ \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.02635806569888059\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26792698826597133,\n\
\ \"acc_stderr\": 0.01131134769063388,\n \"acc_norm\": 0.26792698826597133,\n\
\ \"acc_norm_stderr\": 0.01131134769063388\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.026303648393696036,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.026303648393696036\n \
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\"\
: 0.238562091503268,\n \"acc_stderr\": 0.017242385828779593,\n \"\
acc_norm\": 0.238562091503268,\n \"acc_norm_stderr\": 0.017242385828779593\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.35454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505415,\n \"acc_norm\": 0.35454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505415\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2938775510204082,\n \"acc_stderr\": 0.02916273841024976,\n\
\ \"acc_norm\": 0.2938775510204082,\n \"acc_norm_stderr\": 0.02916273841024976\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.25870646766169153,\n\
\ \"acc_stderr\": 0.030965903123573005,\n \"acc_norm\": 0.25870646766169153,\n\
\ \"acc_norm_stderr\": 0.030965903123573005\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.22289156626506024,\n\
\ \"acc_stderr\": 0.03240004825594688,\n \"acc_norm\": 0.22289156626506024,\n\
\ \"acc_norm_stderr\": 0.03240004825594688\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.26900584795321636,\n \"acc_stderr\": 0.0340105262010409,\n\
\ \"acc_norm\": 0.26900584795321636,\n \"acc_norm_stderr\": 0.0340105262010409\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21052631578947367,\n\
\ \"mc1_stderr\": 0.014271740645964192,\n \"mc2\": 0.34443339278943025,\n\
\ \"mc2_stderr\": 0.01347443593611675\n }\n}\n```"
repo_url: https://huggingface.co/togethercomputer/Pythia-Chat-Base-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|arc:challenge|25_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hellaswag|10_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:21:48.380977.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:21:48.380977.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T15:21:48.380977.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T15:21:48.380977.parquet'
- config_name: results
data_files:
- split: 2023_07_19T15_21_48.380977
path:
- results_2023-07-19T15:21:48.380977.parquet
- split: latest
path:
- results_2023-07-19T15:21:48.380977.parquet
---
# Dataset Card for Evaluation run of togethercomputer/RedPajama-INCITE-Chat-3B-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/togethercomputer/RedPajama-INCITE-Chat-3B-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [togethercomputer/RedPajama-INCITE-Chat-3B-v1](https://huggingface.co/togethercomputer/RedPajama-INCITE-Chat-3B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Chat-3B-v1",
"harness_truthfulqa_mc_0",
split="train")
```
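The per-run splits listed in the configs above are named after the run timestamp, with `-` and `:` replaced by `_`. A small helper (a sketch inferred from the split names in this card, not an official API) can build that split name for `load_dataset`:

```python
def timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp such as '2023-07-19T15:21:48.380977' to the
    corresponding split name, e.g. '2023_07_19T15_21_48.380977'."""
    return timestamp.replace("-", "_").replace(":", "_")
```

Passing the returned name as `split=` selects that specific run instead of the `latest` alias.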
## Latest results
These are the [latest results from run 2023-07-19T15:21:48.380977](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Chat-3B-v1/blob/main/results_2023-07-19T15%3A21%3A48.380977.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26844377474990455,
"acc_stderr": 0.03198093633874286,
"acc_norm": 0.2721408115500613,
"acc_norm_stderr": 0.03197951774489613,
"mc1": 0.21052631578947367,
"mc1_stderr": 0.014271740645964192,
"mc2": 0.34443339278943025,
"mc2_stderr": 0.01347443593611675
},
"harness|arc:challenge|25": {
"acc": 0.3856655290102389,
"acc_stderr": 0.014224250973257168,
"acc_norm": 0.4283276450511945,
"acc_norm_stderr": 0.014460496367599022
},
"harness|hellaswag|10": {
"acc": 0.5006970722963553,
"acc_stderr": 0.00498977656227611,
"acc_norm": 0.6761601274646485,
"acc_norm_stderr": 0.0046698341309770654
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073465,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073465
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.29605263157894735,
"acc_stderr": 0.03715062154998905,
"acc_norm": 0.29605263157894735,
"acc_norm_stderr": 0.03715062154998905
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.30566037735849055,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.30566037735849055,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.037455547914624576,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.037455547914624576
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.23829787234042554,
"acc_stderr": 0.027851252973889774,
"acc_norm": 0.23829787234042554,
"acc_norm_stderr": 0.027851252973889774
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2896551724137931,
"acc_stderr": 0.03780019230438015,
"acc_norm": 0.2896551724137931,
"acc_norm_stderr": 0.03780019230438015
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577656,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.02278967314577656
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.03764950879790605,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.03764950879790605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25806451612903225,
"acc_stderr": 0.024892469172462836,
"acc_norm": 0.25806451612903225,
"acc_norm_stderr": 0.024892469172462836
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.02989611429173355,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.02989611429173355
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.033175059300091805,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.033175059300091805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3838383838383838,
"acc_stderr": 0.03464881675016338,
"acc_norm": 0.3838383838383838,
"acc_norm_stderr": 0.03464881675016338
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.25906735751295334,
"acc_stderr": 0.031618779179354094,
"acc_norm": 0.25906735751295334,
"acc_norm_stderr": 0.031618779179354094
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23846153846153847,
"acc_stderr": 0.02160629449464773,
"acc_norm": 0.23846153846153847,
"acc_norm_stderr": 0.02160629449464773
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507384,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507384
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3229357798165138,
"acc_stderr": 0.02004811592341533,
"acc_norm": 0.3229357798165138,
"acc_norm_stderr": 0.02004811592341533
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.029886910547626964,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.029886910547626964
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.28921568627450983,
"acc_stderr": 0.03182231867647553,
"acc_norm": 0.28921568627450983,
"acc_norm_stderr": 0.03182231867647553
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.13452914798206278,
"acc_stderr": 0.022901183761575593,
"acc_norm": 0.13452914798206278,
"acc_norm_stderr": 0.022901183761575593
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3884297520661157,
"acc_stderr": 0.04449270350068382,
"acc_norm": 0.3884297520661157,
"acc_norm_stderr": 0.04449270350068382
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25153374233128833,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.25153374233128833,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.19642857142857142,
"acc_stderr": 0.037709700493470194,
"acc_norm": 0.19642857142857142,
"acc_norm_stderr": 0.037709700493470194
},
"harness|hendrycksTest-management|5": {
"acc": 0.2815533980582524,
"acc_stderr": 0.0445325483632647,
"acc_norm": 0.2815533980582524,
"acc_norm_stderr": 0.0445325483632647
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24786324786324787,
"acc_stderr": 0.0282863240755644,
"acc_norm": 0.24786324786324787,
"acc_norm_stderr": 0.0282863240755644
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24776500638569604,
"acc_stderr": 0.01543808308056897,
"acc_norm": 0.24776500638569604,
"acc_norm_stderr": 0.01543808308056897
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.28034682080924855,
"acc_stderr": 0.024182427496577605,
"acc_norm": 0.28034682080924855,
"acc_norm_stderr": 0.024182427496577605
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2670391061452514,
"acc_stderr": 0.014796502622562551,
"acc_norm": 0.2670391061452514,
"acc_norm_stderr": 0.014796502622562551
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2679738562091503,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.2679738562091503,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26366559485530544,
"acc_stderr": 0.02502553850053234,
"acc_norm": 0.26366559485530544,
"acc_norm_stderr": 0.02502553850053234
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.29012345679012347,
"acc_stderr": 0.025251173936495022,
"acc_norm": 0.29012345679012347,
"acc_norm_stderr": 0.025251173936495022
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.02635806569888059,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.02635806569888059
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26792698826597133,
"acc_stderr": 0.01131134769063388,
"acc_norm": 0.26792698826597133,
"acc_norm_stderr": 0.01131134769063388
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.25,
"acc_stderr": 0.026303648393696036,
"acc_norm": 0.25,
"acc_norm_stderr": 0.026303648393696036
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.238562091503268,
"acc_stderr": 0.017242385828779593,
"acc_norm": 0.238562091503268,
"acc_norm_stderr": 0.017242385828779593
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.35454545454545455,
"acc_stderr": 0.04582004841505415,
"acc_norm": 0.35454545454545455,
"acc_norm_stderr": 0.04582004841505415
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2938775510204082,
"acc_stderr": 0.02916273841024976,
"acc_norm": 0.2938775510204082,
"acc_norm_stderr": 0.02916273841024976
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.25870646766169153,
"acc_stderr": 0.030965903123573005,
"acc_norm": 0.25870646766169153,
"acc_norm_stderr": 0.030965903123573005
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-virology|5": {
"acc": 0.22289156626506024,
"acc_stderr": 0.03240004825594688,
"acc_norm": 0.22289156626506024,
"acc_norm_stderr": 0.03240004825594688
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.26900584795321636,
"acc_stderr": 0.0340105262010409,
"acc_norm": 0.26900584795321636,
"acc_norm_stderr": 0.0340105262010409
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21052631578947367,
"mc1_stderr": 0.014271740645964192,
"mc2": 0.34443339278943025,
"mc2_stderr": 0.01347443593611675
}
}
```
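Once loaded, the `results` JSON above is a plain nested dictionary; for example, you can drop the "all" aggregate to keep only per-task scores (a minimal sketch over a small hypothetical excerpt of the dictionary):

```python
# A small excerpt of the results dictionary shown above.
results = {
    "all": {"acc": 0.26844377474990455, "acc_norm": 0.2721408115500613},
    "harness|arc:challenge|25": {"acc": 0.3856655290102389, "acc_norm": 0.4283276450511945},
    "harness|hellaswag|10": {"acc": 0.5006970722963553, "acc_norm": 0.6761601274646485},
}

# Keep only the per-task entries (skipping the "all" aggregate)
# and extract their normalized accuracy.
per_task_acc_norm = {
    task: metrics["acc_norm"] for task, metrics in results.items() if task != "all"
}
```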
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Base-3B-v1 | 2023-08-28T20:42:06.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of None
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [None](https://huggingface.co/None) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 119 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run.The \"train\" split is always pointing to the latest\
\ results.\n\nAn additional configuration \"results\" stores all the aggregated results\
\ of the run (and is used to compute and display the aggregated metrics on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Base-3B-v1\"\
,\n\t\"original_mmlu_world_religions_5\",\n\tsplit=\"train\")\n```\n\n## Latest\
\ results\n\nThese are the [latest results from run 2023-08-28T20:41:49.693075](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Base-3B-v1/blob/main/results_2023-08-28T20%3A41%3A49.693075.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.27223714150630224,\n\
\ \"acc_stderr\": 0.03304797167404924\n },\n \"original|mmlu:abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129\n },\n\
\ \"original|mmlu:anatomy|5\": {\n \"acc\": 0.2222222222222222,\n \
\ \"acc_stderr\": 0.035914440841969694\n },\n \"original|mmlu:astronomy|5\"\
: {\n \"acc\": 0.26973684210526316,\n \"acc_stderr\": 0.03611780560284898\n\
\ },\n \"original|mmlu:business_ethics|5\": {\n \"acc\": 0.23,\n \
\ \"acc_stderr\": 0.04229525846816508\n },\n \"original|mmlu:clinical_knowledge|5\"\
: {\n \"acc\": 0.30566037735849055,\n \"acc_stderr\": 0.028353298073322666\n\
\ },\n \"original|mmlu:college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.03586879280080343\n },\n \"original|mmlu:college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316\n },\n\
\ \"original|mmlu:college_computer_science|5\": {\n \"acc\": 0.28,\n \
\ \"acc_stderr\": 0.045126085985421296\n },\n \"original|mmlu:college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127\n },\n\
\ \"original|mmlu:college_medicine|5\": {\n \"acc\": 0.28901734104046245,\n\
\ \"acc_stderr\": 0.034564257450869995\n },\n \"original|mmlu:college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654\n\
\ },\n \"original|mmlu:computer_security|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816505\n },\n \"original|mmlu:conceptual_physics|5\"\
: {\n \"acc\": 0.251063829787234,\n \"acc_stderr\": 0.028346963777162466\n\
\ },\n \"original|mmlu:econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.0409698513984367\n },\n \"original|mmlu:electrical_engineering|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.037245636197746325\n\
\ },\n \"original|mmlu:elementary_mathematics|5\": {\n \"acc\": 0.2671957671957672,\n\
\ \"acc_stderr\": 0.02278967314577656\n },\n \"original|mmlu:formal_logic|5\"\
: {\n \"acc\": 0.19047619047619047,\n \"acc_stderr\": 0.03512207412302054\n\
\ },\n \"original|mmlu:global_facts|5\": {\n \"acc\": 0.3,\n \
\ \"acc_stderr\": 0.046056618647183814\n },\n \"original|mmlu:high_school_biology|5\"\
: {\n \"acc\": 0.27419354838709675,\n \"acc_stderr\": 0.0253781399708852\n\
\ },\n \"original|mmlu:high_school_chemistry|5\": {\n \"acc\": 0.270935960591133,\n\
\ \"acc_stderr\": 0.031270907132976984\n },\n \"original|mmlu:high_school_computer_science|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234\n },\n\
\ \"original|mmlu:high_school_european_history|5\": {\n \"acc\": 0.24242424242424243,\n\
\ \"acc_stderr\": 0.03346409881055953\n },\n \"original|mmlu:high_school_geography|5\"\
: {\n \"acc\": 0.3838383838383838,\n \"acc_stderr\": 0.03464881675016338\n\
\ },\n \"original|mmlu:high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.32124352331606215,\n \"acc_stderr\": 0.033699508685490674\n\
\ },\n \"original|mmlu:high_school_macroeconomics|5\": {\n \"acc\"\
: 0.2794871794871795,\n \"acc_stderr\": 0.022752388839776823\n },\n \
\ \"original|mmlu:high_school_mathematics|5\": {\n \"acc\": 0.26666666666666666,\n\
\ \"acc_stderr\": 0.026962424325073828\n },\n \"original|mmlu:high_school_microeconomics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.02755361446786379\n\
\ },\n \"original|mmlu:high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n\
\ \"acc_stderr\": 0.037345356767871984\n },\n \"original|mmlu:high_school_psychology|5\"\
: {\n \"acc\": 0.3486238532110092,\n \"acc_stderr\": 0.020431254090714328\n\
\ },\n \"original|mmlu:high_school_statistics|5\": {\n \"acc\": 0.28703703703703703,\n\
\ \"acc_stderr\": 0.030851992993257013\n },\n \"original|mmlu:high_school_us_history|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154\n },\n\
\ \"original|mmlu:high_school_world_history|5\": {\n \"acc\": 0.25738396624472576,\n\
\ \"acc_stderr\": 0.028458820991460302\n },\n \"original|mmlu:human_aging|5\"\
: {\n \"acc\": 0.13452914798206278,\n \"acc_stderr\": 0.022901183761575593\n\
\ },\n \"original|mmlu:human_sexuality|5\": {\n \"acc\": 0.19083969465648856,\n\
\ \"acc_stderr\": 0.03446513350752597\n },\n \"original|mmlu:international_law|5\"\
: {\n \"acc\": 0.39669421487603307,\n \"acc_stderr\": 0.04465869780531009\n\
\ },\n \"original|mmlu:jurisprudence|5\": {\n \"acc\": 0.25,\n \
\ \"acc_stderr\": 0.04186091791394607\n },\n \"original|mmlu:logical_fallacies|5\"\
: {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354\n\
\ },\n \"original|mmlu:machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.04157751539865629\n },\n \"original|mmlu:management|5\"\
: {\n \"acc\": 0.36893203883495146,\n \"acc_stderr\": 0.047776151811567386\n\
\ },\n \"original|mmlu:marketing|5\": {\n \"acc\": 0.2264957264957265,\n\
\ \"acc_stderr\": 0.027421007295392912\n },\n \"original|mmlu:medical_genetics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099\n },\n\
\ \"original|mmlu:miscellaneous|5\": {\n \"acc\": 0.24648786717752236,\n\
\ \"acc_stderr\": 0.015411308769686936\n },\n \"original|mmlu:moral_disputes|5\"\
: {\n \"acc\": 0.2947976878612717,\n \"acc_stderr\": 0.02454761779480383\n\
\ },\n \"original|mmlu:moral_scenarios|5\": {\n \"acc\": 0.23575418994413408,\n\
\ \"acc_stderr\": 0.014196375686290803\n },\n \"original|mmlu:nutrition|5\"\
: {\n \"acc\": 0.2875816993464052,\n \"acc_stderr\": 0.02591780611714716\n\
\ },\n \"original|mmlu:philosophy|5\": {\n \"acc\": 0.2733118971061093,\n\
\ \"acc_stderr\": 0.02531176597542612\n },\n \"original|mmlu:prehistory|5\"\
: {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02492200116888634\n\
\ },\n \"original|mmlu:professional_accounting|5\": {\n \"acc\": 0.2730496453900709,\n\
\ \"acc_stderr\": 0.026577860943307857\n },\n \"original|mmlu:professional_law|5\"\
: {\n \"acc\": 0.2685788787483703,\n \"acc_stderr\": 0.011320056629121734\n\
\ },\n \"original|mmlu:professional_medicine|5\": {\n \"acc\": 0.4117647058823529,\n\
\ \"acc_stderr\": 0.029896163033125478\n },\n \"original|mmlu:professional_psychology|5\"\
: {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.017479487001364764\n\
\ },\n \"original|mmlu:public_relations|5\": {\n \"acc\": 0.33636363636363636,\n\
\ \"acc_stderr\": 0.04525393596302506\n },\n \"original|mmlu:security_studies|5\"\
: {\n \"acc\": 0.3306122448979592,\n \"acc_stderr\": 0.03011642629654061\n\
\ },\n \"original|mmlu:sociology|5\": {\n \"acc\": 0.29850746268656714,\n\
\ \"acc_stderr\": 0.03235743789355042\n },\n \"original|mmlu:us_foreign_policy|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036624\n },\n\
\ \"original|mmlu:virology|5\": {\n \"acc\": 0.21084337349397592,\n \
\ \"acc_stderr\": 0.03175554786629921\n },\n \"original|mmlu:world_religions|5\"\
: {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.034886477134579215\n\
\ }\n}\n```"
repo_url: https://huggingface.co/None
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|arc:challenge|25_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hellaswag|10_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:11:56.441864.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:11:56.441864.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T15:11:56.441864.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T15:11:56.441864.parquet'
- config_name: original_mmlu_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T20:41:49.693075.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_abstract_algebra_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_anatomy_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_astronomy_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_business_ethics_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_clinical_knowledge_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_college_biology_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_college_chemistry_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_college_computer_science_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_college_mathematics_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_college_medicine_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_college_physics_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_computer_security_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_conceptual_physics_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_econometrics_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_electrical_engineering_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_elementary_mathematics_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_formal_logic_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_global_facts_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_high_school_biology_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_high_school_chemistry_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_high_school_computer_science_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_high_school_european_history_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_high_school_geography_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_high_school_mathematics_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_high_school_microeconomics_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_high_school_physics_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_high_school_psychology_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_high_school_statistics_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_high_school_us_history_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_high_school_world_history_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_human_aging_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_human_sexuality_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_international_law_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_jurisprudence_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_logical_fallacies_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_machine_learning_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_management_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:management|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:management|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_marketing_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_medical_genetics_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_miscellaneous_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_moral_disputes_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_moral_scenarios_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_nutrition_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_philosophy_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_prehistory_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_professional_accounting_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_professional_law_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_professional_medicine_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_professional_psychology_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_public_relations_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_security_studies_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_sociology_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_us_foreign_policy_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_virology_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:virology|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:virology|5_2023-08-28T20:41:49.693075.parquet'
- config_name: original_mmlu_world_religions_5
data_files:
- split: 2023_08_28T20_41_49.693075
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:41:49.693075.parquet'
- split: latest
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:41:49.693075.parquet'
- config_name: results
data_files:
- split: 2023_07_19T15_11_56.441864
path:
- results_2023-07-19T15:11:56.441864.parquet
- split: 2023_08_28T20_41_49.693075
path:
- results_2023-08-28T20:41:49.693075.parquet
- split: latest
path:
- results_2023-08-28T20:41:49.693075.parquet
---
# Dataset Card for Evaluation run of None
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/None
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [None](https://huggingface.co/None) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 119 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Base-3B-v1",
"original_mmlu_world_religions_5",
split="train")
```
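Judging from the file listing above, the config names appear to be derived from the task names used in the results JSON by replacing the `|` and `:` separators with underscores. A small helper sketching that apparent convention (the function name is hypothetical, not part of the `datasets` API):

```python
import re

def task_to_config(task: str) -> str:
    """Map a task name from the results JSON to its dataset config name."""
    # e.g. "original|mmlu:world_religions|5" -> "original_mmlu_world_religions_5"
    return re.sub(r"[|:]", "_", task)

print(task_to_config("original|mmlu:world_religions|5"))
```

This makes it easy to go from a key in the results dictionary below to the config name expected by `load_dataset`.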
## Latest results
These are the [latest results from run 2023-08-28T20:41:49.693075](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-3B-v1/blob/main/results_2023-08-28T20%3A41%3A49.693075.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.27223714150630224,
"acc_stderr": 0.03304797167404924
},
"original|mmlu:abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129
},
"original|mmlu:anatomy|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.035914440841969694
},
"original|mmlu:astronomy|5": {
"acc": 0.26973684210526316,
"acc_stderr": 0.03611780560284898
},
"original|mmlu:business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816508
},
"original|mmlu:clinical_knowledge|5": {
"acc": 0.30566037735849055,
"acc_stderr": 0.028353298073322666
},
"original|mmlu:college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080343
},
"original|mmlu:college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316
},
"original|mmlu:college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296
},
"original|mmlu:college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127
},
"original|mmlu:college_medicine|5": {
"acc": 0.28901734104046245,
"acc_stderr": 0.034564257450869995
},
"original|mmlu:college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654
},
"original|mmlu:computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505
},
"original|mmlu:conceptual_physics|5": {
"acc": 0.251063829787234,
"acc_stderr": 0.028346963777162466
},
"original|mmlu:econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.0409698513984367
},
"original|mmlu:electrical_engineering|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.037245636197746325
},
"original|mmlu:elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577656
},
"original|mmlu:formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.03512207412302054
},
"original|mmlu:global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814
},
"original|mmlu:high_school_biology|5": {
"acc": 0.27419354838709675,
"acc_stderr": 0.0253781399708852
},
"original|mmlu:high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.031270907132976984
},
"original|mmlu:high_school_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234
},
"original|mmlu:high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953
},
"original|mmlu:high_school_geography|5": {
"acc": 0.3838383838383838,
"acc_stderr": 0.03464881675016338
},
"original|mmlu:high_school_government_and_politics|5": {
"acc": 0.32124352331606215,
"acc_stderr": 0.033699508685490674
},
"original|mmlu:high_school_macroeconomics|5": {
"acc": 0.2794871794871795,
"acc_stderr": 0.022752388839776823
},
"original|mmlu:high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073828
},
"original|mmlu:high_school_microeconomics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.02755361446786379
},
"original|mmlu:high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984
},
"original|mmlu:high_school_psychology|5": {
"acc": 0.3486238532110092,
"acc_stderr": 0.020431254090714328
},
"original|mmlu:high_school_statistics|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.030851992993257013
},
"original|mmlu:high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154
},
"original|mmlu:high_school_world_history|5": {
"acc": 0.25738396624472576,
"acc_stderr": 0.028458820991460302
},
"original|mmlu:human_aging|5": {
"acc": 0.13452914798206278,
"acc_stderr": 0.022901183761575593
},
"original|mmlu:human_sexuality|5": {
"acc": 0.19083969465648856,
"acc_stderr": 0.03446513350752597
},
"original|mmlu:international_law|5": {
"acc": 0.39669421487603307,
"acc_stderr": 0.04465869780531009
},
"original|mmlu:jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607
},
"original|mmlu:logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354
},
"original|mmlu:machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.04157751539865629
},
"original|mmlu:management|5": {
"acc": 0.36893203883495146,
"acc_stderr": 0.047776151811567386
},
"original|mmlu:marketing|5": {
"acc": 0.2264957264957265,
"acc_stderr": 0.027421007295392912
},
"original|mmlu:medical_genetics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099
},
"original|mmlu:miscellaneous|5": {
"acc": 0.24648786717752236,
"acc_stderr": 0.015411308769686936
},
"original|mmlu:moral_disputes|5": {
"acc": 0.2947976878612717,
"acc_stderr": 0.02454761779480383
},
"original|mmlu:moral_scenarios|5": {
"acc": 0.23575418994413408,
"acc_stderr": 0.014196375686290803
},
"original|mmlu:nutrition|5": {
"acc": 0.2875816993464052,
"acc_stderr": 0.02591780611714716
},
"original|mmlu:philosophy|5": {
"acc": 0.2733118971061093,
"acc_stderr": 0.02531176597542612
},
"original|mmlu:prehistory|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02492200116888634
},
"original|mmlu:professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.026577860943307857
},
"original|mmlu:professional_law|5": {
"acc": 0.2685788787483703,
"acc_stderr": 0.011320056629121734
},
"original|mmlu:professional_medicine|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.029896163033125478
},
"original|mmlu:professional_psychology|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.017479487001364764
},
"original|mmlu:public_relations|5": {
"acc": 0.33636363636363636,
"acc_stderr": 0.04525393596302506
},
"original|mmlu:security_studies|5": {
"acc": 0.3306122448979592,
"acc_stderr": 0.03011642629654061
},
"original|mmlu:sociology|5": {
"acc": 0.29850746268656714,
"acc_stderr": 0.03235743789355042
},
"original|mmlu:us_foreign_policy|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036624
},
"original|mmlu:virology|5": {
"acc": 0.21084337349397592,
"acc_stderr": 0.03175554786629921
},
"original|mmlu:world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.034886477134579215
}
}
```
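The aggregate "all" accuracy appears to be an unweighted mean over the per-subtask accuracies. A quick sanity-check sketch, assuming that aggregation rule and using only a truncated subset of the tasks above:

```python
# Subset of the per-task accuracies from the results above (truncated for brevity).
results = {
    "all": {"acc": 0.27223714150630224},
    "original|mmlu:abstract_algebra|5": {"acc": 0.28},
    "original|mmlu:anatomy|5": {"acc": 0.2222222222222222},
    "original|mmlu:astronomy|5": {"acc": 0.26973684210526316},
}

# Unweighted mean over the per-subtask accuracies, excluding the aggregate entry.
per_task = [v["acc"] for k, v in results.items() if k != "all"]
mean_acc = sum(per_task) / len(per_task)
print(round(mean_acc, 4))
```

With the full set of 57 subtasks this mean should match the reported "all" accuracy; with only the three tasks shown here it is merely illustrative.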
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-7B-Base | 2023-09-08T17:48:32.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of togethercomputer/RedPajama-INCITE-7B-Base
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [togethercomputer/RedPajama-INCITE-7B-Base](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Base)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 122 configurations, each one corresponding to one\
\ of the evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest\
\ results.\n\nAn additional configuration \"results\" stores all the aggregated results\
\ of the run (and is used to compute and display the aggregated metrics on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-7B-Base\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-08T17:48:19.912039](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-7B-Base/blob/main/results_2023-09-08T17-48-19.912039.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n\
\ \"em_stderr\": 0.00033145814652192694,\n \"f1\": 0.05110738255033561,\n\
\ \"f1_stderr\": 0.0012343063700893503,\n \"acc\": 0.3445825177884037,\n\
\ \"acc_stderr\": 0.008314908287260184\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.00033145814652192694,\n\
\ \"f1\": 0.05110738255033561,\n \"f1_stderr\": 0.0012343063700893503\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01592115238817286,\n \
\ \"acc_stderr\": 0.0034478192723889985\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6732438831886346,\n \"acc_stderr\": 0.013181997302131368\n\
\ }\n}\n```"
repo_url: https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Base
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|arc:challenge|25_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|arc:challenge|25_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_08T17_48_19.912039
path:
- '**/details_harness|drop|3_2023-09-08T17-48-19.912039.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-08T17-48-19.912039.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_08T17_48_19.912039
path:
- '**/details_harness|gsm8k|5_2023-09-08T17-48-19.912039.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-08T17-48-19.912039.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hellaswag|10_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hellaswag|10_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T12:24:47.590202.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T10:56:03.209346.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T12:24:47.590202.parquet'
- split: 2023_07_19T10_56_03.209346
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T10:56:03.209346.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T10:56:03.209346.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_08T17_48_19.912039
path:
- '**/details_harness|winogrande|5_2023-09-08T17-48-19.912039.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-08T17-48-19.912039.parquet'
- config_name: original_mmlu_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T20:40:09.683575.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_abstract_algebra_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_anatomy_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_astronomy_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_business_ethics_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_clinical_knowledge_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_college_biology_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_college_chemistry_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_college_computer_science_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_college_mathematics_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_college_medicine_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_college_physics_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_computer_security_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_conceptual_physics_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_econometrics_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_electrical_engineering_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_elementary_mathematics_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_formal_logic_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_global_facts_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_high_school_biology_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_high_school_chemistry_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_high_school_computer_science_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_high_school_european_history_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_high_school_geography_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_high_school_mathematics_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_high_school_microeconomics_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_high_school_physics_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_high_school_psychology_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_high_school_statistics_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_high_school_us_history_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_high_school_world_history_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_human_aging_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_human_sexuality_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_international_law_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_jurisprudence_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_logical_fallacies_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_machine_learning_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_management_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:management|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:management|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_marketing_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_medical_genetics_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_miscellaneous_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_moral_disputes_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_moral_scenarios_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_nutrition_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_philosophy_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_prehistory_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_professional_accounting_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_professional_law_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_professional_medicine_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_professional_psychology_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_public_relations_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_security_studies_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_sociology_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_us_foreign_policy_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_virology_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:virology|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:virology|5_2023-08-28T20:40:09.683575.parquet'
- config_name: original_mmlu_world_religions_5
data_files:
- split: 2023_08_28T20_40_09.683575
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:40:09.683575.parquet'
- split: latest
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:40:09.683575.parquet'
- config_name: results
data_files:
- split: 2023_07_18T12_24_47.590202
path:
- results_2023-07-18T12:24:47.590202.parquet
- split: 2023_07_19T10_56_03.209346
path:
- results_2023-07-19T10:56:03.209346.parquet
- split: 2023_08_28T20_40_09.683575
path:
- results_2023-08-28T20:40:09.683575.parquet
- split: 2023_09_08T17_48_19.912039
path:
- results_2023-09-08T17-48-19.912039.parquet
- split: latest
path:
- results_2023-09-08T17-48-19.912039.parquet
---
# Dataset Card for Evaluation run of togethercomputer/RedPajama-INCITE-7B-Base
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Base
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [togethercomputer/RedPajama-INCITE-7B-Base](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 122 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-7B-Base",
"harness_winogrande_5",
split="train")
```
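The config and split names follow a simple, regular scheme: details configs are named `harness_<task>_<n_shots>` (e.g. `harness_winogrande_5`), and each run's split name is its timestamp with `-` and `:` replaced by `_` (e.g. `2023-09-08T17:48:19.912039` becomes `2023_09_08T17_48_19.912039`). The helpers below are an illustration of that naming convention inferred from the `configs` list above, not part of the `datasets` API:

```python
# Illustrative helpers (not part of any library): build the config name and
# split name used by this dataset, following the pattern visible in the
# `configs` section of the card metadata.

def harness_config_name(task: str, n_shots: int) -> str:
    # e.g. ("winogrande", 5) -> "harness_winogrande_5"
    return f"harness_{task}_{n_shots}"

def run_split_name(timestamp: str) -> str:
    # e.g. "2023-09-08T17:48:19.912039" -> "2023_09_08T17_48_19.912039"
    # (the fractional-seconds dot is kept as-is)
    return timestamp.replace("-", "_").replace(":", "_")

print(harness_config_name("winogrande", 5))          # harness_winogrande_5
print(run_split_name("2023-09-08T17:48:19.912039"))  # 2023_09_08T17_48_19.912039
```

With these, a specific run can be selected instead of the `latest` split by passing `run_split_name(...)` as the `split` argument to `load_dataset`.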
## Latest results
These are the [latest results from run 2023-09-08T17:48:19.912039](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-7B-Base/blob/main/results_2023-09-08T17-48-19.912039.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0010486577181208054,
"em_stderr": 0.00033145814652192694,
"f1": 0.05110738255033561,
"f1_stderr": 0.0012343063700893503,
"acc": 0.3445825177884037,
"acc_stderr": 0.008314908287260184
},
"harness|drop|3": {
"em": 0.0010486577181208054,
"em_stderr": 0.00033145814652192694,
"f1": 0.05110738255033561,
"f1_stderr": 0.0012343063700893503
},
"harness|gsm8k|5": {
"acc": 0.01592115238817286,
"acc_stderr": 0.0034478192723889985
},
"harness|winogrande|5": {
"acc": 0.6732438831886346,
"acc_stderr": 0.013181997302131368
}
}
```
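The `"all"` block above appears to be the unweighted mean of each metric over the tasks that report it (an assumption inferred from the numbers, not documented behaviour): averaging the two `acc` values recovers the reported `0.3445825...`. A minimal sketch of that aggregation:

```python
# Sketch, assuming "all" is the plain per-metric mean across tasks.
# Values are copied from the latest-results JSON above.
results = {
    "harness|drop|3": {"em": 0.0010486577181208054, "f1": 0.05110738255033561},
    "harness|gsm8k|5": {"acc": 0.01592115238817286},
    "harness|winogrande|5": {"acc": 0.6732438831886346},
}

def aggregate(per_task: dict) -> dict:
    # Collect every occurrence of each metric name, then average.
    totals: dict = {}
    for metrics in per_task.values():
        for name, value in metrics.items():
            totals.setdefault(name, []).append(value)
    return {name: sum(vals) / len(vals) for name, vals in totals.items()}

agg = aggregate(results)
print(agg["acc"])  # 0.3445825177884037, matching the "all" block
```

Metrics reported by only one task (here `em` and `f1` from DROP) pass through unchanged, which is why the `"all"` and `harness|drop|3` values coincide.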
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-7B-Chat | 2023-08-27T12:37:27.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of togethercomputer/RedPajama-INCITE-7B-Chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [togethercomputer/RedPajama-INCITE-7B-Chat](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Chat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-7B-Chat\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-19T17:30:52.522444](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-7B-Chat/blob/main/results_2023-07-19T17%3A30%3A52.522444.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.27619387525296146,\n\
\ \"acc_stderr\": 0.03233610750347649,\n \"acc_norm\": 0.27942852251055406,\n\
\ \"acc_norm_stderr\": 0.032331753456860894,\n \"mc1\": 0.23011015911872704,\n\
\ \"mc1_stderr\": 0.014734557959807767,\n \"mc2\": 0.36094978005416933,\n\
\ \"mc2_stderr\": 0.01546050915660241\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.38993174061433444,\n \"acc_stderr\": 0.014252959848892887,\n\
\ \"acc_norm\": 0.4206484641638225,\n \"acc_norm_stderr\": 0.014426211252508406\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5480979884485162,\n\
\ \"acc_stderr\": 0.004966640868083862,\n \"acc_norm\": 0.7082254530969926,\n\
\ \"acc_norm_stderr\": 0.004536500714147989\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.03785714465066654,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.03785714465066654\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.26973684210526316,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.26973684210526316,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.28679245283018867,\n \"acc_stderr\": 0.027834912527544074,\n\
\ \"acc_norm\": 0.28679245283018867,\n \"acc_norm_stderr\": 0.027834912527544074\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3194444444444444,\n\
\ \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.3194444444444444,\n\
\ \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.0379328118530781,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.0379328118530781\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3148936170212766,\n \"acc_stderr\": 0.030363582197238156,\n\
\ \"acc_norm\": 0.3148936170212766,\n \"acc_norm_stderr\": 0.030363582197238156\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.04227054451232199,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.04227054451232199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.038061426873099935,\n\
\ \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.038061426873099935\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23809523809523808,\n \"acc_stderr\": 0.021935878081184766,\n \"\
acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.021935878081184766\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.0393253768039287,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.0393253768039287\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25483870967741934,\n\
\ \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.25483870967741934,\n\
\ \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.22167487684729065,\n \"acc_stderr\": 0.029225575892489614,\n\
\ \"acc_norm\": 0.22167487684729065,\n \"acc_norm_stderr\": 0.029225575892489614\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139405,\n\
\ \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139405\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.19696969696969696,\n \"acc_stderr\": 0.028335609732463355,\n \"\
acc_norm\": 0.19696969696969696,\n \"acc_norm_stderr\": 0.028335609732463355\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.25906735751295334,\n \"acc_stderr\": 0.031618779179354115,\n\
\ \"acc_norm\": 0.25906735751295334,\n \"acc_norm_stderr\": 0.031618779179354115\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2948717948717949,\n \"acc_stderr\": 0.02311936275823229,\n \
\ \"acc_norm\": 0.2948717948717949,\n \"acc_norm_stderr\": 0.02311936275823229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176892,\n\
\ \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176892\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.29541284403669726,\n\
\ \"acc_stderr\": 0.019560619182975997,\n \"acc_norm\": 0.29541284403669726,\n\
\ \"acc_norm_stderr\": 0.019560619182975997\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.18055555555555555,\n \"acc_stderr\": 0.026232878971491652,\n\
\ \"acc_norm\": 0.18055555555555555,\n \"acc_norm_stderr\": 0.026232878971491652\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.22058823529411764,\n \"acc_stderr\": 0.029102254389674082,\n \"\
acc_norm\": 0.22058823529411764,\n \"acc_norm_stderr\": 0.029102254389674082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2911392405063291,\n \"acc_stderr\": 0.02957160106575337,\n \
\ \"acc_norm\": 0.2911392405063291,\n \"acc_norm_stderr\": 0.02957160106575337\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3183856502242152,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.3183856502242152,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.30578512396694213,\n \"acc_stderr\": 0.04205953933884122,\n \"\
acc_norm\": 0.30578512396694213,\n \"acc_norm_stderr\": 0.04205953933884122\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.042878587513404565,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.042878587513404565\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2912621359223301,\n \"acc_stderr\": 0.04498676320572922,\n\
\ \"acc_norm\": 0.2912621359223301,\n \"acc_norm_stderr\": 0.04498676320572922\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3162393162393162,\n\
\ \"acc_stderr\": 0.030463656747340254,\n \"acc_norm\": 0.3162393162393162,\n\
\ \"acc_norm_stderr\": 0.030463656747340254\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.30395913154533843,\n\
\ \"acc_stderr\": 0.016448321686769046,\n \"acc_norm\": 0.30395913154533843,\n\
\ \"acc_norm_stderr\": 0.016448321686769046\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2832369942196532,\n \"acc_stderr\": 0.02425790170532337,\n\
\ \"acc_norm\": 0.2832369942196532,\n \"acc_norm_stderr\": 0.02425790170532337\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2558659217877095,\n\
\ \"acc_stderr\": 0.014593620923210728,\n \"acc_norm\": 0.2558659217877095,\n\
\ \"acc_norm_stderr\": 0.014593620923210728\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.025058503316958157,\n\
\ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.025058503316958157\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.31189710610932475,\n\
\ \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.31189710610932475,\n\
\ \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.025407197798890162,\n\
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.025407197798890162\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.29432624113475175,\n \"acc_stderr\": 0.027187127011503796,\n \
\ \"acc_norm\": 0.29432624113475175,\n \"acc_norm_stderr\": 0.027187127011503796\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24119947848761408,\n\
\ \"acc_stderr\": 0.01092649610203496,\n \"acc_norm\": 0.24119947848761408,\n\
\ \"acc_norm_stderr\": 0.01092649610203496\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.19117647058823528,\n \"acc_stderr\": 0.02388688192244036,\n\
\ \"acc_norm\": 0.19117647058823528,\n \"acc_norm_stderr\": 0.02388688192244036\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26633986928104575,\n \"acc_stderr\": 0.01788318813466719,\n \
\ \"acc_norm\": 0.26633986928104575,\n \"acc_norm_stderr\": 0.01788318813466719\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.21224489795918366,\n \"acc_stderr\": 0.026176967197866767,\n\
\ \"acc_norm\": 0.21224489795918366,\n \"acc_norm_stderr\": 0.026176967197866767\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.27860696517412936,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.27860696517412936,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2469879518072289,\n\
\ \"acc_stderr\": 0.03357351982064536,\n \"acc_norm\": 0.2469879518072289,\n\
\ \"acc_norm_stderr\": 0.03357351982064536\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3567251461988304,\n \"acc_stderr\": 0.03674013002860954,\n\
\ \"acc_norm\": 0.3567251461988304,\n \"acc_norm_stderr\": 0.03674013002860954\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23011015911872704,\n\
\ \"mc1_stderr\": 0.014734557959807767,\n \"mc2\": 0.36094978005416933,\n\
\ \"mc2_stderr\": 0.01546050915660241\n }\n}\n```"
repo_url: https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:30:52.522444.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:30:52.522444.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:30:52.522444.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:30:52.522444.parquet'
- config_name: results
data_files:
- split: 2023_07_19T17_30_52.522444
path:
- results_2023-07-19T17:30:52.522444.parquet
- split: latest
path:
- results_2023-07-19T17:30:52.522444.parquet
---
# Dataset Card for Evaluation run of togethercomputer/RedPajama-INCITE-7B-Chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [togethercomputer/RedPajama-INCITE-7B-Chat](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-7B-Chat",
"harness_truthfulqa_mc_0",
split="train")
```
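Note that the timestamped split names are derived from the run timestamp by replacing the `-` and `:` separators with underscores (a pattern visible in the configurations above). A minimal sketch of that mapping, assuming this simple substitution rule:

```python
def split_name(timestamp: str) -> str:
    # Split names in this dataset replace the date/time separators
    # of the run timestamp with underscores.
    return timestamp.replace("-", "_").replace(":", "_")

print(split_name("2023-07-19T17:30:52.522444"))
# → 2023_07_19T17_30_52.522444
```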
## Latest results
These are the [latest results from run 2023-07-19T17:30:52.522444](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-7B-Chat/blob/main/results_2023-07-19T17%3A30%3A52.522444.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.27619387525296146,
"acc_stderr": 0.03233610750347649,
"acc_norm": 0.27942852251055406,
"acc_norm_stderr": 0.032331753456860894,
"mc1": 0.23011015911872704,
"mc1_stderr": 0.014734557959807767,
"mc2": 0.36094978005416933,
"mc2_stderr": 0.01546050915660241
},
"harness|arc:challenge|25": {
"acc": 0.38993174061433444,
"acc_stderr": 0.014252959848892887,
"acc_norm": 0.4206484641638225,
"acc_norm_stderr": 0.014426211252508406
},
"harness|hellaswag|10": {
"acc": 0.5480979884485162,
"acc_stderr": 0.004966640868083862,
"acc_norm": 0.7082254530969926,
"acc_norm_stderr": 0.004536500714147989
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.03785714465066654,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.03785714465066654
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.26973684210526316,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.26973684210526316,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.28679245283018867,
"acc_stderr": 0.027834912527544074,
"acc_norm": 0.28679245283018867,
"acc_norm_stderr": 0.027834912527544074
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3194444444444444,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.3194444444444444,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.0379328118530781,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.0379328118530781
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3148936170212766,
"acc_stderr": 0.030363582197238156,
"acc_norm": 0.3148936170212766,
"acc_norm_stderr": 0.030363582197238156
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.04227054451232199,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.04227054451232199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.296551724137931,
"acc_stderr": 0.038061426873099935,
"acc_norm": 0.296551724137931,
"acc_norm_stderr": 0.038061426873099935
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.021935878081184766,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.021935878081184766
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.0393253768039287,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.0393253768039287
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25483870967741934,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.25483870967741934,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.22167487684729065,
"acc_stderr": 0.029225575892489614,
"acc_norm": 0.22167487684729065,
"acc_norm_stderr": 0.029225575892489614
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.03374402644139405,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.03374402644139405
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.19696969696969696,
"acc_stderr": 0.028335609732463355,
"acc_norm": 0.19696969696969696,
"acc_norm_stderr": 0.028335609732463355
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.25906735751295334,
"acc_stderr": 0.031618779179354115,
"acc_norm": 0.25906735751295334,
"acc_norm_stderr": 0.031618779179354115
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.02311936275823229,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.02311936275823229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.02620276653465215,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.02620276653465215
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.29541284403669726,
"acc_stderr": 0.019560619182975997,
"acc_norm": 0.29541284403669726,
"acc_norm_stderr": 0.019560619182975997
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18055555555555555,
"acc_stderr": 0.026232878971491652,
"acc_norm": 0.18055555555555555,
"acc_norm_stderr": 0.026232878971491652
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22058823529411764,
"acc_stderr": 0.029102254389674082,
"acc_norm": 0.22058823529411764,
"acc_norm_stderr": 0.029102254389674082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2911392405063291,
"acc_stderr": 0.02957160106575337,
"acc_norm": 0.2911392405063291,
"acc_norm_stderr": 0.02957160106575337
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3183856502242152,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.3183856502242152,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.30578512396694213,
"acc_stderr": 0.04205953933884122,
"acc_norm": 0.30578512396694213,
"acc_norm_stderr": 0.04205953933884122
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404565,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404565
},
"harness|hendrycksTest-management|5": {
"acc": 0.2912621359223301,
"acc_stderr": 0.04498676320572922,
"acc_norm": 0.2912621359223301,
"acc_norm_stderr": 0.04498676320572922
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3162393162393162,
"acc_stderr": 0.030463656747340254,
"acc_norm": 0.3162393162393162,
"acc_norm_stderr": 0.030463656747340254
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.30395913154533843,
"acc_stderr": 0.016448321686769046,
"acc_norm": 0.30395913154533843,
"acc_norm_stderr": 0.016448321686769046
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2832369942196532,
"acc_stderr": 0.02425790170532337,
"acc_norm": 0.2832369942196532,
"acc_norm_stderr": 0.02425790170532337
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2558659217877095,
"acc_stderr": 0.014593620923210728,
"acc_norm": 0.2558659217877095,
"acc_norm_stderr": 0.014593620923210728
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.025058503316958157,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.025058503316958157
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.31189710610932475,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.31189710610932475,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.025407197798890162,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.025407197798890162
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.29432624113475175,
"acc_stderr": 0.027187127011503796,
"acc_norm": 0.29432624113475175,
"acc_norm_stderr": 0.027187127011503796
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24119947848761408,
"acc_stderr": 0.01092649610203496,
"acc_norm": 0.24119947848761408,
"acc_norm_stderr": 0.01092649610203496
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.19117647058823528,
"acc_stderr": 0.02388688192244036,
"acc_norm": 0.19117647058823528,
"acc_norm_stderr": 0.02388688192244036
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26633986928104575,
"acc_stderr": 0.01788318813466719,
"acc_norm": 0.26633986928104575,
"acc_norm_stderr": 0.01788318813466719
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.21224489795918366,
"acc_stderr": 0.026176967197866767,
"acc_norm": 0.21224489795918366,
"acc_norm_stderr": 0.026176967197866767
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.27860696517412936,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.27860696517412936,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2469879518072289,
"acc_stderr": 0.03357351982064536,
"acc_norm": 0.2469879518072289,
"acc_norm_stderr": 0.03357351982064536
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3567251461988304,
"acc_stderr": 0.03674013002860954,
"acc_norm": 0.3567251461988304,
"acc_norm_stderr": 0.03674013002860954
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23011015911872704,
"mc1_stderr": 0.014734557959807767,
"mc2": 0.36094978005416933,
"mc2_stderr": 0.01546050915660241
}
}
```
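As a rough illustration (not the leaderboard's actual aggregation code, which may weight tasks differently), an overall accuracy such as the one reported under `"all"` can be computed as the mean of per-task accuracies. A sketch over a small subset of the values above:

```python
# Per-task accuracies (a subset of those reported above).
task_acc = {
    "harness|hendrycksTest-abstract_algebra|5": 0.25,
    "harness|hendrycksTest-anatomy|5": 0.25925925925925924,
    "harness|hendrycksTest-astronomy|5": 0.26973684210526316,
}

# Unweighted mean over this subset of tasks.
avg = sum(task_acc.values()) / len(task_acc)
print(round(avg, 4))
# → 0.2597
```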
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_togethercomputer__LLaMA-2-7B-32K | 2023-08-27T12:37:29.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of togethercomputer/LLaMA-2-7B-32K
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [togethercomputer/LLaMA-2-7B-32K](https://huggingface.co/togethercomputer/LLaMA-2-7B-32K)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_togethercomputer__LLaMA-2-7B-32K\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-09T14:19:55.056276](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__LLaMA-2-7B-32K/blob/main/results_2023-08-09T14%3A19%3A55.056276.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4354440020820605,\n\
\ \"acc_stderr\": 0.035153041751986865,\n \"acc_norm\": 0.4395306963568048,\n\
\ \"acc_norm_stderr\": 0.035143043919087395,\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.015250117079156494,\n \"mc2\": 0.39230194165552396,\n\
\ \"mc2_stderr\": 0.013964844219302998\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.43430034129692835,\n \"acc_stderr\": 0.01448470304885736,\n\
\ \"acc_norm\": 0.47525597269624575,\n \"acc_norm_stderr\": 0.014593487694937738\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5612427803226449,\n\
\ \"acc_stderr\": 0.00495220983185658,\n \"acc_norm\": 0.7614021111332404,\n\
\ \"acc_norm_stderr\": 0.004253553044707781\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.0404633688397825,\n\
\ \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.0404633688397825\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4490566037735849,\n \"acc_stderr\": 0.030612730713641095,\n\
\ \"acc_norm\": 0.4490566037735849,\n \"acc_norm_stderr\": 0.030612730713641095\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4027777777777778,\n\
\ \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.4027777777777778,\n\
\ \"acc_norm_stderr\": 0.04101405519842426\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3352601156069364,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.3352601156069364,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3659574468085106,\n \"acc_stderr\": 0.03148955829745529,\n\
\ \"acc_norm\": 0.3659574468085106,\n \"acc_norm_stderr\": 0.03148955829745529\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.042663394431593935,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.042663394431593935\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4068965517241379,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.4068965517241379,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30423280423280424,\n \"acc_stderr\": 0.02369541500946309,\n \"\
acc_norm\": 0.30423280423280424,\n \"acc_norm_stderr\": 0.02369541500946309\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.03512207412302054,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.03512207412302054\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.4935483870967742,\n \"acc_stderr\": 0.02844163823354051,\n \"\
acc_norm\": 0.4935483870967742,\n \"acc_norm_stderr\": 0.02844163823354051\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.31527093596059114,\n \"acc_stderr\": 0.03269080871970186,\n \"\
acc_norm\": 0.31527093596059114,\n \"acc_norm_stderr\": 0.03269080871970186\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5878787878787879,\n \"acc_stderr\": 0.03843566993588717,\n\
\ \"acc_norm\": 0.5878787878787879,\n \"acc_norm_stderr\": 0.03843566993588717\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.51010101010101,\n \"acc_stderr\": 0.035616254886737454,\n \"acc_norm\"\
: 0.51010101010101,\n \"acc_norm_stderr\": 0.035616254886737454\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.6476683937823834,\n \"acc_stderr\": 0.03447478286414357,\n\
\ \"acc_norm\": 0.6476683937823834,\n \"acc_norm_stderr\": 0.03447478286414357\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.38461538461538464,\n \"acc_stderr\": 0.024666744915187222,\n\
\ \"acc_norm\": 0.38461538461538464,\n \"acc_norm_stderr\": 0.024666744915187222\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844072,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844072\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.36134453781512604,\n \"acc_stderr\": 0.03120469122515001,\n\
\ \"acc_norm\": 0.36134453781512604,\n \"acc_norm_stderr\": 0.03120469122515001\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.037579499229433426,\n \"\
acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.037579499229433426\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5981651376146789,\n \"acc_stderr\": 0.021020106172997016,\n \"\
acc_norm\": 0.5981651376146789,\n \"acc_norm_stderr\": 0.021020106172997016\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5931372549019608,\n \"acc_stderr\": 0.034478911363533815,\n \"\
acc_norm\": 0.5931372549019608,\n \"acc_norm_stderr\": 0.034478911363533815\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6455696202531646,\n \"acc_stderr\": 0.03113730429718581,\n \
\ \"acc_norm\": 0.6455696202531646,\n \"acc_norm_stderr\": 0.03113730429718581\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.45739910313901344,\n\
\ \"acc_stderr\": 0.033435777055830646,\n \"acc_norm\": 0.45739910313901344,\n\
\ \"acc_norm_stderr\": 0.033435777055830646\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3969465648854962,\n \"acc_stderr\": 0.04291135671009224,\n\
\ \"acc_norm\": 0.3969465648854962,\n \"acc_norm_stderr\": 0.04291135671009224\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5785123966942148,\n \"acc_stderr\": 0.04507732278775087,\n \"\
acc_norm\": 0.5785123966942148,\n \"acc_norm_stderr\": 0.04507732278775087\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4722222222222222,\n\
\ \"acc_stderr\": 0.04826217294139894,\n \"acc_norm\": 0.4722222222222222,\n\
\ \"acc_norm_stderr\": 0.04826217294139894\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4539877300613497,\n \"acc_stderr\": 0.0391170190467718,\n\
\ \"acc_norm\": 0.4539877300613497,\n \"acc_norm_stderr\": 0.0391170190467718\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.041577515398656284,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.041577515398656284\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.47572815533980584,\n \"acc_stderr\": 0.049449010929737795,\n\
\ \"acc_norm\": 0.47572815533980584,\n \"acc_norm_stderr\": 0.049449010929737795\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6410256410256411,\n\
\ \"acc_stderr\": 0.03142616993791924,\n \"acc_norm\": 0.6410256410256411,\n\
\ \"acc_norm_stderr\": 0.03142616993791924\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6206896551724138,\n\
\ \"acc_stderr\": 0.01735126811754445,\n \"acc_norm\": 0.6206896551724138,\n\
\ \"acc_norm_stderr\": 0.01735126811754445\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.42196531791907516,\n \"acc_stderr\": 0.02658923114217426,\n\
\ \"acc_norm\": 0.42196531791907516,\n \"acc_norm_stderr\": 0.02658923114217426\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.434640522875817,\n \"acc_stderr\": 0.02838425670488304,\n\
\ \"acc_norm\": 0.434640522875817,\n \"acc_norm_stderr\": 0.02838425670488304\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5305466237942122,\n\
\ \"acc_stderr\": 0.02834504586484063,\n \"acc_norm\": 0.5305466237942122,\n\
\ \"acc_norm_stderr\": 0.02834504586484063\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4691358024691358,\n \"acc_stderr\": 0.02776768960683392,\n\
\ \"acc_norm\": 0.4691358024691358,\n \"acc_norm_stderr\": 0.02776768960683392\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.33687943262411346,\n \"acc_stderr\": 0.02819553487396673,\n \
\ \"acc_norm\": 0.33687943262411346,\n \"acc_norm_stderr\": 0.02819553487396673\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34876140808344197,\n\
\ \"acc_stderr\": 0.012172035157127116,\n \"acc_norm\": 0.34876140808344197,\n\
\ \"acc_norm_stderr\": 0.012172035157127116\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.03016191193076711,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.03016191193076711\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3888888888888889,\n \"acc_stderr\": 0.019722058939618068,\n \
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.019722058939618068\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n\
\ \"acc_stderr\": 0.04785964010794917,\n \"acc_norm\": 0.5181818181818182,\n\
\ \"acc_norm_stderr\": 0.04785964010794917\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4163265306122449,\n \"acc_stderr\": 0.03155782816556164,\n\
\ \"acc_norm\": 0.4163265306122449,\n \"acc_norm_stderr\": 0.03155782816556164\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5472636815920398,\n\
\ \"acc_stderr\": 0.035197027175769155,\n \"acc_norm\": 0.5472636815920398,\n\
\ \"acc_norm_stderr\": 0.035197027175769155\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3373493975903614,\n\
\ \"acc_stderr\": 0.03680783690727581,\n \"acc_norm\": 0.3373493975903614,\n\
\ \"acc_norm_stderr\": 0.03680783690727581\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6140350877192983,\n \"acc_stderr\": 0.03733756969066165,\n\
\ \"acc_norm\": 0.6140350877192983,\n \"acc_norm_stderr\": 0.03733756969066165\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.015250117079156494,\n \"mc2\": 0.39230194165552396,\n\
\ \"mc2_stderr\": 0.013964844219302998\n }\n}\n```"
repo_url: https://huggingface.co/togethercomputer/LLaMA-2-7B-32K
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|arc:challenge|25_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|arc:challenge|25_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hellaswag|10_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hellaswag|10_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T09:44:03.510382.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T14:19:55.056276.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:19:55.056276.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T09:44:03.510382.parquet'
- split: 2023_08_09T14_19_55.056276
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T14:19:55.056276.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T14:19:55.056276.parquet'
- config_name: results
data_files:
- split: 2023_08_09T09_44_03.510382
path:
- results_2023-08-09T09:44:03.510382.parquet
- split: 2023_08_09T14_19_55.056276
path:
- results_2023-08-09T14:19:55.056276.parquet
- split: latest
path:
- results_2023-08-09T14:19:55.056276.parquet
---
# Dataset Card for Evaluation run of togethercomputer/LLaMA-2-7B-32K
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/togethercomputer/LLaMA-2-7B-32K
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [togethercomputer/LLaMA-2-7B-32K](https://huggingface.co/togethercomputer/LLaMA-2-7B-32K) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_togethercomputer__LLaMA-2-7B-32K",
"harness_truthfulqa_mc_0",
	split="latest")
```
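As the configuration above shows, each run's split name is derived from its ISO timestamp by replacing the hyphens and colons with underscores (the `T` separator and fractional seconds are kept). A minimal helper sketching that mapping — this is an illustration of the naming convention, not part of the `datasets` API:

```python
def run_split_name(timestamp: str) -> str:
    """Turn an ISO run timestamp into the corresponding split name.

    Hyphens and colons become underscores; 'T' and the fractional
    seconds are kept as-is.
    """
    return timestamp.replace("-", "_").replace(":", "_")

# The second run of this dataset:
print(run_split_name("2023-08-09T14:19:55.056276"))
# -> 2023_08_09T14_19_55.056276
```

You can pass such a name as the `split` argument to `load_dataset` to retrieve the details of one specific run instead of the latest one.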
## Latest results
These are the [latest results from run 2023-08-09T14:19:55.056276](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__LLaMA-2-7B-32K/blob/main/results_2023-08-09T14%3A19%3A55.056276.json) (note that there might be results for other tasks in this repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4354440020820605,
"acc_stderr": 0.035153041751986865,
"acc_norm": 0.4395306963568048,
"acc_norm_stderr": 0.035143043919087395,
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156494,
"mc2": 0.39230194165552396,
"mc2_stderr": 0.013964844219302998
},
"harness|arc:challenge|25": {
"acc": 0.43430034129692835,
"acc_stderr": 0.01448470304885736,
"acc_norm": 0.47525597269624575,
"acc_norm_stderr": 0.014593487694937738
},
"harness|hellaswag|10": {
"acc": 0.5612427803226449,
"acc_stderr": 0.00495220983185658,
"acc_norm": 0.7614021111332404,
"acc_norm_stderr": 0.004253553044707781
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.0404633688397825,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.0404633688397825
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4490566037735849,
"acc_stderr": 0.030612730713641095,
"acc_norm": 0.4490566037735849,
"acc_norm_stderr": 0.030612730713641095
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.04101405519842426,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.04101405519842426
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3352601156069364,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.3352601156069364,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3659574468085106,
"acc_stderr": 0.03148955829745529,
"acc_norm": 0.3659574468085106,
"acc_norm_stderr": 0.03148955829745529
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.042663394431593935,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.042663394431593935
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4068965517241379,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.4068965517241379,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30423280423280424,
"acc_stderr": 0.02369541500946309,
"acc_norm": 0.30423280423280424,
"acc_norm_stderr": 0.02369541500946309
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.03512207412302054,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.03512207412302054
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4935483870967742,
"acc_stderr": 0.02844163823354051,
"acc_norm": 0.4935483870967742,
"acc_norm_stderr": 0.02844163823354051
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.31527093596059114,
"acc_stderr": 0.03269080871970186,
"acc_norm": 0.31527093596059114,
"acc_norm_stderr": 0.03269080871970186
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5878787878787879,
"acc_stderr": 0.03843566993588717,
"acc_norm": 0.5878787878787879,
"acc_norm_stderr": 0.03843566993588717
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.51010101010101,
"acc_stderr": 0.035616254886737454,
"acc_norm": 0.51010101010101,
"acc_norm_stderr": 0.035616254886737454
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6476683937823834,
"acc_stderr": 0.03447478286414357,
"acc_norm": 0.6476683937823834,
"acc_norm_stderr": 0.03447478286414357
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.38461538461538464,
"acc_stderr": 0.024666744915187222,
"acc_norm": 0.38461538461538464,
"acc_norm_stderr": 0.024666744915187222
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.026593939101844072,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.026593939101844072
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.36134453781512604,
"acc_stderr": 0.03120469122515001,
"acc_norm": 0.36134453781512604,
"acc_norm_stderr": 0.03120469122515001
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.037579499229433426,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.037579499229433426
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5981651376146789,
"acc_stderr": 0.021020106172997016,
"acc_norm": 0.5981651376146789,
"acc_norm_stderr": 0.021020106172997016
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5931372549019608,
"acc_stderr": 0.034478911363533815,
"acc_norm": 0.5931372549019608,
"acc_norm_stderr": 0.034478911363533815
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6455696202531646,
"acc_stderr": 0.03113730429718581,
"acc_norm": 0.6455696202531646,
"acc_norm_stderr": 0.03113730429718581
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.45739910313901344,
"acc_stderr": 0.033435777055830646,
"acc_norm": 0.45739910313901344,
"acc_norm_stderr": 0.033435777055830646
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3969465648854962,
"acc_stderr": 0.04291135671009224,
"acc_norm": 0.3969465648854962,
"acc_norm_stderr": 0.04291135671009224
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5785123966942148,
"acc_stderr": 0.04507732278775087,
"acc_norm": 0.5785123966942148,
"acc_norm_stderr": 0.04507732278775087
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04826217294139894,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04826217294139894
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4539877300613497,
"acc_stderr": 0.0391170190467718,
"acc_norm": 0.4539877300613497,
"acc_norm_stderr": 0.0391170190467718
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.041577515398656284,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.041577515398656284
},
"harness|hendrycksTest-management|5": {
"acc": 0.47572815533980584,
"acc_stderr": 0.049449010929737795,
"acc_norm": 0.47572815533980584,
"acc_norm_stderr": 0.049449010929737795
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.03142616993791924,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.03142616993791924
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.01735126811754445,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.01735126811754445
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.42196531791907516,
"acc_stderr": 0.02658923114217426,
"acc_norm": 0.42196531791907516,
"acc_norm_stderr": 0.02658923114217426
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.434640522875817,
"acc_stderr": 0.02838425670488304,
"acc_norm": 0.434640522875817,
"acc_norm_stderr": 0.02838425670488304
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5305466237942122,
"acc_stderr": 0.02834504586484063,
"acc_norm": 0.5305466237942122,
"acc_norm_stderr": 0.02834504586484063
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4691358024691358,
"acc_stderr": 0.02776768960683392,
"acc_norm": 0.4691358024691358,
"acc_norm_stderr": 0.02776768960683392
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.33687943262411346,
"acc_stderr": 0.02819553487396673,
"acc_norm": 0.33687943262411346,
"acc_norm_stderr": 0.02819553487396673
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34876140808344197,
"acc_stderr": 0.012172035157127116,
"acc_norm": 0.34876140808344197,
"acc_norm_stderr": 0.012172035157127116
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.03016191193076711,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.03016191193076711
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.019722058939618068,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.019722058939618068
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794917,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794917
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4163265306122449,
"acc_stderr": 0.03155782816556164,
"acc_norm": 0.4163265306122449,
"acc_norm_stderr": 0.03155782816556164
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5472636815920398,
"acc_stderr": 0.035197027175769155,
"acc_norm": 0.5472636815920398,
"acc_norm_stderr": 0.035197027175769155
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3373493975903614,
"acc_stderr": 0.03680783690727581,
"acc_norm": 0.3373493975903614,
"acc_norm_stderr": 0.03680783690727581
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6140350877192983,
"acc_stderr": 0.03733756969066165,
"acc_norm": 0.6140350877192983,
"acc_norm_stderr": 0.03733756969066165
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156494,
"mc2": 0.39230194165552396,
"mc2_stderr": 0.013964844219302998
}
}
```
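The per-task `acc` values above feed the run-level aggregate stored in the "results" configuration. As a rough sketch (not the leaderboard's actual aggregation code, which may weight or group tasks differently), the aggregate is approximately the unweighted mean of the per-task accuracies; the values below are taken from three of the tasks in this run:

```python
# Sketch: averaging per-task accuracies into a run-level score.
# Illustrative only -- the leaderboard's own aggregation may differ.
from statistics import mean

per_task_acc = {
    "hendrycksTest-college_physics": 0.22549019607843138,
    "hendrycksTest-computer_security": 0.55,
    "hendrycksTest-world_religions": 0.6140350877192983,
}

aggregate = mean(per_task_acc.values())
print(round(aggregate, 4))  # → 0.4632
```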
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Base-7B-v0.1 | 2023-08-27T12:37:32.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of togethercomputer/RedPajama-INCITE-Base-7B-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [togethercomputer/RedPajama-INCITE-Base-7B-v0.1](https://huggingface.co/togethercomputer/RedPajama-INCITE-Base-7B-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Base-7B-v0.1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-19T16:33:56.917496](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Base-7B-v0.1/blob/main/results_2023-07-19T16%3A33%3A56.917496.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2836091768768163,\n\
\ \"acc_stderr\": 0.032568618896188,\n \"acc_norm\": 0.28741642067511136,\n\
\ \"acc_norm_stderr\": 0.032562772494161576,\n \"mc1\": 0.23133414932680538,\n\
\ \"mc1_stderr\": 0.014761945174862677,\n \"mc2\": 0.3303475420460559,\n\
\ \"mc2_stderr\": 0.0129988571376805\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.42235494880546076,\n \"acc_stderr\": 0.014434138713379974,\n\
\ \"acc_norm\": 0.46245733788395904,\n \"acc_norm_stderr\": 0.014570144495075576\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5317665803624776,\n\
\ \"acc_stderr\": 0.004979700695747948,\n \"acc_norm\": 0.7162915753833897,\n\
\ \"acc_norm_stderr\": 0.004498757194493408\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n\
\ \"acc_stderr\": 0.03712537833614867,\n \"acc_norm\": 0.24444444444444444,\n\
\ \"acc_norm_stderr\": 0.03712537833614867\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.29605263157894735,\n \"acc_stderr\": 0.03715062154998905,\n\
\ \"acc_norm\": 0.29605263157894735,\n \"acc_norm_stderr\": 0.03715062154998905\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.33584905660377357,\n \"acc_stderr\": 0.029067220146644826,\n\
\ \"acc_norm\": 0.33584905660377357,\n \"acc_norm_stderr\": 0.029067220146644826\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2916666666666667,\n\
\ \"acc_stderr\": 0.03800968060554858,\n \"acc_norm\": 0.2916666666666667,\n\
\ \"acc_norm_stderr\": 0.03800968060554858\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3063583815028902,\n\
\ \"acc_stderr\": 0.03514942551267437,\n \"acc_norm\": 0.3063583815028902,\n\
\ \"acc_norm_stderr\": 0.03514942551267437\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2851063829787234,\n \"acc_stderr\": 0.029513196625539355,\n\
\ \"acc_norm\": 0.2851063829787234,\n \"acc_norm_stderr\": 0.029513196625539355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.040493392977481425,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.040493392977481425\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n\
\ \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708614,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708614\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2709677419354839,\n \"acc_stderr\": 0.025284416114900156,\n \"\
acc_norm\": 0.2709677419354839,\n \"acc_norm_stderr\": 0.025284416114900156\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.27586206896551724,\n \"acc_stderr\": 0.03144712581678242,\n \"\
acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.03144712581678242\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.37373737373737376,\n \"acc_stderr\": 0.034468977386593325,\n \"\
acc_norm\": 0.37373737373737376,\n \"acc_norm_stderr\": 0.034468977386593325\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.23834196891191708,\n \"acc_stderr\": 0.030748905363909902,\n\
\ \"acc_norm\": 0.23834196891191708,\n \"acc_norm_stderr\": 0.030748905363909902\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2923076923076923,\n \"acc_stderr\": 0.023060438380857744,\n\
\ \"acc_norm\": 0.2923076923076923,\n \"acc_norm_stderr\": 0.023060438380857744\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230196,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230196\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.02959732973097808,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.02959732973097808\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23841059602649006,\n \"acc_stderr\": 0.0347918557259966,\n \"\
acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.0347918557259966\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.326605504587156,\n \"acc_stderr\": 0.020106990889937306,\n \"\
acc_norm\": 0.326605504587156,\n \"acc_norm_stderr\": 0.020106990889937306\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2916666666666667,\n \"acc_stderr\": 0.03099866630456052,\n \"\
acc_norm\": 0.2916666666666667,\n \"acc_norm_stderr\": 0.03099866630456052\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.23529411764705882,\n \"acc_stderr\": 0.02977177522814563,\n \"\
acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.02977177522814563\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.27848101265822783,\n \"acc_stderr\": 0.029178682304842534,\n \
\ \"acc_norm\": 0.27848101265822783,\n \"acc_norm_stderr\": 0.029178682304842534\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.14798206278026907,\n\
\ \"acc_stderr\": 0.023831557157613543,\n \"acc_norm\": 0.14798206278026907,\n\
\ \"acc_norm_stderr\": 0.023831557157613543\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2644628099173554,\n \"acc_stderr\": 0.040261875275912046,\n \"\
acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.040261875275912046\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\
\ \"acc_stderr\": 0.041331194402438376,\n \"acc_norm\": 0.24074074074074073,\n\
\ \"acc_norm_stderr\": 0.041331194402438376\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2085889570552147,\n \"acc_stderr\": 0.03192193448934725,\n\
\ \"acc_norm\": 0.2085889570552147,\n \"acc_norm_stderr\": 0.03192193448934725\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.19642857142857142,\n\
\ \"acc_stderr\": 0.03770970049347019,\n \"acc_norm\": 0.19642857142857142,\n\
\ \"acc_norm_stderr\": 0.03770970049347019\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.30097087378640774,\n \"acc_stderr\": 0.045416094465039476,\n\
\ \"acc_norm\": 0.30097087378640774,\n \"acc_norm_stderr\": 0.045416094465039476\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3034188034188034,\n\
\ \"acc_stderr\": 0.030118210106942645,\n \"acc_norm\": 0.3034188034188034,\n\
\ \"acc_norm_stderr\": 0.030118210106942645\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.27586206896551724,\n\
\ \"acc_stderr\": 0.01598281477469563,\n \"acc_norm\": 0.27586206896551724,\n\
\ \"acc_norm_stderr\": 0.01598281477469563\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.27167630057803466,\n \"acc_stderr\": 0.02394851290546837,\n\
\ \"acc_norm\": 0.27167630057803466,\n \"acc_norm_stderr\": 0.02394851290546837\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.014355911964767857,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.014355911964767857\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.28104575163398693,\n \"acc_stderr\": 0.025738854797818737,\n\
\ \"acc_norm\": 0.28104575163398693,\n \"acc_norm_stderr\": 0.025738854797818737\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2797427652733119,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.2797427652733119,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.32407407407407407,\n \"acc_stderr\": 0.026041766202717163,\n\
\ \"acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.026041766202717163\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2907801418439716,\n \"acc_stderr\": 0.027090664368353178,\n \
\ \"acc_norm\": 0.2907801418439716,\n \"acc_norm_stderr\": 0.027090664368353178\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2561929595827901,\n\
\ \"acc_stderr\": 0.011149173153110582,\n \"acc_norm\": 0.2561929595827901,\n\
\ \"acc_norm_stderr\": 0.011149173153110582\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.02439819298665492,\n\
\ \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.02439819298665492\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2973856209150327,\n \"acc_stderr\": 0.01849259653639695,\n \
\ \"acc_norm\": 0.2973856209150327,\n \"acc_norm_stderr\": 0.01849259653639695\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.34545454545454546,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.34545454545454546,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2897959183673469,\n \"acc_stderr\": 0.02904308868330434,\n\
\ \"acc_norm\": 0.2897959183673469,\n \"acc_norm_stderr\": 0.02904308868330434\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21686746987951808,\n\
\ \"acc_stderr\": 0.03208284450356365,\n \"acc_norm\": 0.21686746987951808,\n\
\ \"acc_norm_stderr\": 0.03208284450356365\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.30409356725146197,\n \"acc_stderr\": 0.03528211258245232,\n\
\ \"acc_norm\": 0.30409356725146197,\n \"acc_norm_stderr\": 0.03528211258245232\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23133414932680538,\n\
\ \"mc1_stderr\": 0.014761945174862677,\n \"mc2\": 0.3303475420460559,\n\
\ \"mc2_stderr\": 0.0129988571376805\n }\n}\n```"
repo_url: https://huggingface.co/togethercomputer/RedPajama-INCITE-Base-7B-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:33:56.917496.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:33:56.917496.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:33:56.917496.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:33:56.917496.parquet'
- config_name: results
data_files:
- split: 2023_07_19T16_33_56.917496
path:
- results_2023-07-19T16:33:56.917496.parquet
- split: latest
path:
- results_2023-07-19T16:33:56.917496.parquet
---
# Dataset Card for Evaluation run of togethercomputer/Pythia-Chat-Base-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/togethercomputer/Pythia-Chat-Base-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [togethercomputer/Pythia-Chat-Base-7B](https://huggingface.co/togethercomputer/Pythia-Chat-Base-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_togethercomputer__Pythia-Chat-Base-7B",
"harness_truthfulqa_mc_0",
split="train")
```
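Each per-run split is named after the run timestamp, with the `-` and `:` characters replaced by `_` (e.g. run `2023-07-19T16:33:56.917496` becomes split `2023_07_19T16_33_56.917496`). As a sketch, a small helper (hypothetical, not part of the `datasets` library) can derive the split name from a run timestamp; alternatively, pass `split="latest"` to always get the most recent run:

```python
def split_name_from_timestamp(ts: str) -> str:
    """Convert a run timestamp such as '2023-07-19T16:33:56.917496'
    into the corresponding split name, '2023_07_19T16_33_56.917496'."""
    date, time = ts.split("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")


split = split_name_from_timestamp("2023-07-19T16:33:56.917496")
# → "2023_07_19T16_33_56.917496"

# Then load that specific run (requires network access to the Hub):
# from datasets import load_dataset
# data = load_dataset("open-llm-leaderboard/details_togethercomputer__Pythia-Chat-Base-7B",
#                     "harness_truthfulqa_mc_0",
#                     split=split)  # or split="latest"
```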
## Latest results
These are the [latest results from run 2023-07-19T16:33:56.917496](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__Pythia-Chat-Base-7B/blob/main/results_2023-07-19T16%3A33%3A56.917496.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2836091768768163,
"acc_stderr": 0.032568618896188,
"acc_norm": 0.28741642067511136,
"acc_norm_stderr": 0.032562772494161576,
"mc1": 0.23133414932680538,
"mc1_stderr": 0.014761945174862677,
"mc2": 0.3303475420460559,
"mc2_stderr": 0.0129988571376805
},
"harness|arc:challenge|25": {
"acc": 0.42235494880546076,
"acc_stderr": 0.014434138713379974,
"acc_norm": 0.46245733788395904,
"acc_norm_stderr": 0.014570144495075576
},
"harness|hellaswag|10": {
"acc": 0.5317665803624776,
"acc_stderr": 0.004979700695747948,
"acc_norm": 0.7162915753833897,
"acc_norm_stderr": 0.004498757194493408
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.03712537833614867,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.03712537833614867
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.29605263157894735,
"acc_stderr": 0.03715062154998905,
"acc_norm": 0.29605263157894735,
"acc_norm_stderr": 0.03715062154998905
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.33584905660377357,
"acc_stderr": 0.029067220146644826,
"acc_norm": 0.33584905660377357,
"acc_norm_stderr": 0.029067220146644826
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.03800968060554858,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.03800968060554858
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3063583815028902,
"acc_stderr": 0.03514942551267437,
"acc_norm": 0.3063583815028902,
"acc_norm_stderr": 0.03514942551267437
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2851063829787234,
"acc_stderr": 0.029513196625539355,
"acc_norm": 0.2851063829787234,
"acc_norm_stderr": 0.029513196625539355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481425,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481425
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708614,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708614
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.043435254289490965,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.043435254289490965
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2709677419354839,
"acc_stderr": 0.025284416114900156,
"acc_norm": 0.2709677419354839,
"acc_norm_stderr": 0.025284416114900156
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.03144712581678242,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.03144712581678242
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.37373737373737376,
"acc_stderr": 0.034468977386593325,
"acc_norm": 0.37373737373737376,
"acc_norm_stderr": 0.034468977386593325
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23834196891191708,
"acc_stderr": 0.030748905363909902,
"acc_norm": 0.23834196891191708,
"acc_norm_stderr": 0.030748905363909902
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2923076923076923,
"acc_stderr": 0.023060438380857744,
"acc_norm": 0.2923076923076923,
"acc_norm_stderr": 0.023060438380857744
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230196,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230196
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.02959732973097808,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.02959732973097808
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.0347918557259966,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.0347918557259966
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.326605504587156,
"acc_stderr": 0.020106990889937306,
"acc_norm": 0.326605504587156,
"acc_norm_stderr": 0.020106990889937306
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.03099866630456052,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.03099866630456052
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.02977177522814563,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.02977177522814563
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.27848101265822783,
"acc_stderr": 0.029178682304842534,
"acc_norm": 0.27848101265822783,
"acc_norm_stderr": 0.029178682304842534
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.14798206278026907,
"acc_stderr": 0.023831557157613543,
"acc_norm": 0.14798206278026907,
"acc_norm_stderr": 0.023831557157613543
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2644628099173554,
"acc_stderr": 0.040261875275912046,
"acc_norm": 0.2644628099173554,
"acc_norm_stderr": 0.040261875275912046
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.041331194402438376,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.041331194402438376
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2085889570552147,
"acc_stderr": 0.03192193448934725,
"acc_norm": 0.2085889570552147,
"acc_norm_stderr": 0.03192193448934725
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.19642857142857142,
"acc_stderr": 0.03770970049347019,
"acc_norm": 0.19642857142857142,
"acc_norm_stderr": 0.03770970049347019
},
"harness|hendrycksTest-management|5": {
"acc": 0.30097087378640774,
"acc_stderr": 0.045416094465039476,
"acc_norm": 0.30097087378640774,
"acc_norm_stderr": 0.045416094465039476
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3034188034188034,
"acc_stderr": 0.030118210106942645,
"acc_norm": 0.3034188034188034,
"acc_norm_stderr": 0.030118210106942645
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.01598281477469563,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.01598281477469563
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.27167630057803466,
"acc_stderr": 0.02394851290546837,
"acc_norm": 0.27167630057803466,
"acc_norm_stderr": 0.02394851290546837
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767857,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767857
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.28104575163398693,
"acc_stderr": 0.025738854797818737,
"acc_norm": 0.28104575163398693,
"acc_norm_stderr": 0.025738854797818737
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2797427652733119,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.2797427652733119,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.026041766202717163,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.026041766202717163
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2907801418439716,
"acc_stderr": 0.027090664368353178,
"acc_norm": 0.2907801418439716,
"acc_norm_stderr": 0.027090664368353178
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2561929595827901,
"acc_stderr": 0.011149173153110582,
"acc_norm": 0.2561929595827901,
"acc_norm_stderr": 0.011149173153110582
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.02439819298665492,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.02439819298665492
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2973856209150327,
"acc_stderr": 0.01849259653639695,
"acc_norm": 0.2973856209150327,
"acc_norm_stderr": 0.01849259653639695
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.34545454545454546,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.34545454545454546,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2897959183673469,
"acc_stderr": 0.02904308868330434,
"acc_norm": 0.2897959183673469,
"acc_norm_stderr": 0.02904308868330434
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21686746987951808,
"acc_stderr": 0.03208284450356365,
"acc_norm": 0.21686746987951808,
"acc_norm_stderr": 0.03208284450356365
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30409356725146197,
"acc_stderr": 0.03528211258245232,
"acc_norm": 0.30409356725146197,
"acc_norm_stderr": 0.03528211258245232
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23133414932680538,
"mc1_stderr": 0.014761945174862677,
"mc2": 0.3303475420460559,
"mc2_stderr": 0.0129988571376805
}
}
```
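The per-subject "hendrycksTest" (MMLU) accuracies above can be combined into a single score with a simple macro-average. The sketch below uses a small illustrative subset of the values from the results JSON, not all 57 subjects, and the leaderboard's own aggregation may weight tasks differently:

```python
# Macro-average a few per-subject MMLU accuracies taken from the results above.
# Illustrative subset only; the full benchmark has 57 subjects.
results = {
    "harness|hendrycksTest-management|5": 0.30097087378640774,
    "harness|hendrycksTest-marketing|5": 0.3034188034188034,
    "harness|hendrycksTest-virology|5": 0.21686746987951808,
}

# Unweighted mean over subjects (macro-average).
mmlu_acc = sum(results.values()) / len(results)
print(f"macro-averaged acc over {len(results)} subjects: {mmlu_acc:.4f}")
```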
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Instruct-3B-v1 | 2023-08-27T12:37:34.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of togethercomputer/RedPajama-INCITE-Instruct-3B-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [togethercomputer/RedPajama-INCITE-Instruct-3B-v1](https://huggingface.co/togethercomputer/RedPajama-INCITE-Instruct-3B-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Instruct-3B-v1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-19T14:55:52.470090](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Instruct-3B-v1/blob/main/results_2023-07-19T14%3A55%3A52.470090.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25651929626019937,\n\
\ \"acc_stderr\": 0.031626569595913936,\n \"acc_norm\": 0.25997865484677185,\n\
\ \"acc_norm_stderr\": 0.031625762756392936,\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931576,\n \"mc2\": 0.3641251521956313,\n\
\ \"mc2_stderr\": 0.013552794498811752\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.38310580204778155,\n \"acc_stderr\": 0.014206472661672884,\n\
\ \"acc_norm\": 0.41552901023890787,\n \"acc_norm_stderr\": 0.014401366641216383\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.48317068313085043,\n\
\ \"acc_stderr\": 0.004986954139737528,\n \"acc_norm\": 0.6548496315475005,\n\
\ \"acc_norm_stderr\": 0.004744456628455116\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.03633384414073465,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.03633384414073465\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.20394736842105263,\n \"acc_stderr\": 0.032790004063100515,\n\
\ \"acc_norm\": 0.20394736842105263,\n \"acc_norm_stderr\": 0.032790004063100515\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.31,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.24150943396226415,\n \"acc_stderr\": 0.02634148037111835,\n\
\ \"acc_norm\": 0.24150943396226415,\n \"acc_norm_stderr\": 0.02634148037111835\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.22916666666666666,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.19653179190751446,\n\
\ \"acc_stderr\": 0.030299574664788147,\n \"acc_norm\": 0.19653179190751446,\n\
\ \"acc_norm_stderr\": 0.030299574664788147\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.038739587141493524,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.038739587141493524\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3276595744680851,\n \"acc_stderr\": 0.030683020843231004,\n\
\ \"acc_norm\": 0.3276595744680851,\n \"acc_norm_stderr\": 0.030683020843231004\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.19310344827586207,\n \"acc_stderr\": 0.032894455221273995,\n\
\ \"acc_norm\": 0.19310344827586207,\n \"acc_norm_stderr\": 0.032894455221273995\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24338624338624337,\n \"acc_stderr\": 0.022101128787415426,\n \"\
acc_norm\": 0.24338624338624337,\n \"acc_norm_stderr\": 0.022101128787415426\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\
\ \"acc_stderr\": 0.03764950879790606,\n \"acc_norm\": 0.23015873015873015,\n\
\ \"acc_norm_stderr\": 0.03764950879790606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22258064516129034,\n\
\ \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.22258064516129034,\n\
\ \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.21182266009852216,\n \"acc_stderr\": 0.02874898368994106,\n\
\ \"acc_norm\": 0.21182266009852216,\n \"acc_norm_stderr\": 0.02874898368994106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.0331750593000918,\n\
\ \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.0331750593000918\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.1919191919191919,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.1919191919191919,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.23316062176165803,\n \"acc_stderr\": 0.030516111371476005,\n\
\ \"acc_norm\": 0.23316062176165803,\n \"acc_norm_stderr\": 0.030516111371476005\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.1794871794871795,\n \"acc_stderr\": 0.0194573907876818,\n \
\ \"acc_norm\": 0.1794871794871795,\n \"acc_norm_stderr\": 0.0194573907876818\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712163,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712163\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.027025433498882374,\n\
\ \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.027025433498882374\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360383,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360383\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21834862385321102,\n \"acc_stderr\": 0.01771260052872273,\n \"\
acc_norm\": 0.21834862385321102,\n \"acc_norm_stderr\": 0.01771260052872273\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.18518518518518517,\n \"acc_stderr\": 0.026491914727355154,\n \"\
acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.026491914727355154\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.19607843137254902,\n \"acc_stderr\": 0.027865942286639325,\n \"\
acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.027865942286639325\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2742616033755274,\n \"acc_stderr\": 0.02904133351059804,\n \
\ \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.02904133351059804\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3004484304932735,\n\
\ \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.3004484304932735,\n\
\ \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2975206611570248,\n \"acc_stderr\": 0.04173349148083499,\n \"\
acc_norm\": 0.2975206611570248,\n \"acc_norm_stderr\": 0.04173349148083499\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n\
\ \"acc_stderr\": 0.04373313040914761,\n \"acc_norm\": 0.28703703703703703,\n\
\ \"acc_norm_stderr\": 0.04373313040914761\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.042878587513404565,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.042878587513404565\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.04245022486384493,\n\
\ \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.04245022486384493\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2863247863247863,\n\
\ \"acc_stderr\": 0.029614323690456648,\n \"acc_norm\": 0.2863247863247863,\n\
\ \"acc_norm_stderr\": 0.029614323690456648\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24265644955300128,\n\
\ \"acc_stderr\": 0.015329888940899873,\n \"acc_norm\": 0.24265644955300128,\n\
\ \"acc_norm_stderr\": 0.015329888940899873\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2774566473988439,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.2774566473988439,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\
\ \"acc_stderr\": 0.014310999547961452,\n \"acc_norm\": 0.24134078212290502,\n\
\ \"acc_norm_stderr\": 0.014310999547961452\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875192,\n\
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875192\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.27009646302250806,\n\
\ \"acc_stderr\": 0.025218040373410616,\n \"acc_norm\": 0.27009646302250806,\n\
\ \"acc_norm_stderr\": 0.025218040373410616\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2345679012345679,\n \"acc_stderr\": 0.023576881744005712,\n\
\ \"acc_norm\": 0.2345679012345679,\n \"acc_norm_stderr\": 0.023576881744005712\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843007,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843007\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27183833116036504,\n\
\ \"acc_stderr\": 0.01136313527865141,\n \"acc_norm\": 0.27183833116036504,\n\
\ \"acc_norm_stderr\": 0.01136313527865141\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.22058823529411764,\n \"acc_stderr\": 0.025187786660227248,\n\
\ \"acc_norm\": 0.22058823529411764,\n \"acc_norm_stderr\": 0.025187786660227248\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25326797385620914,\n \"acc_stderr\": 0.01759348689536683,\n \
\ \"acc_norm\": 0.25326797385620914,\n \"acc_norm_stderr\": 0.01759348689536683\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.35454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505417,\n \"acc_norm\": 0.35454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505417\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n\
\ \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n\
\ \"acc_stderr\": 0.02992941540834836,\n \"acc_norm\": 0.23383084577114427,\n\
\ \"acc_norm_stderr\": 0.02992941540834836\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3313253012048193,\n\
\ \"acc_stderr\": 0.036643147772880864,\n \"acc_norm\": 0.3313253012048193,\n\
\ \"acc_norm_stderr\": 0.036643147772880864\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.034886477134579215,\n\
\ \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.034886477134579215\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931576,\n \"mc2\": 0.3641251521956313,\n\
\ \"mc2_stderr\": 0.013552794498811752\n }\n}\n```"
repo_url: https://huggingface.co/togethercomputer/RedPajama-INCITE-Instruct-3B-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:55:52.470090.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:55:52.470090.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:55:52.470090.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:55:52.470090.parquet'
- config_name: results
data_files:
- split: 2023_07_19T14_55_52.470090
path:
- results_2023-07-19T14:55:52.470090.parquet
- split: latest
path:
- results_2023-07-19T14:55:52.470090.parquet
---
# Dataset Card for Evaluation run of togethercomputer/RedPajama-INCITE-Instruct-3B-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/togethercomputer/RedPajama-INCITE-Instruct-3B-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [togethercomputer/RedPajama-INCITE-Instruct-3B-v1](https://huggingface.co/togethercomputer/RedPajama-INCITE-Instruct-3B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Instruct-3B-v1",
"harness_truthfulqa_mc_0",
split="train")
```
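The timestamped splits listed in the YAML header follow a simple naming rule: the run timestamp with `-` and `:` replaced by `_`, since those characters are not valid in split names. The helper below is written here for illustration only (it is not part of the `datasets` library) and reproduces that mapping:

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to its split name, e.g. for split=...

    Split names cannot contain '-' or ':', so both are replaced by '_'
    (this mirrors the split names listed in the YAML header above).
    """
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-07-19T14:55:52.470090"))
# → 2023_07_19T14_55_52.470090
```

Passing `split="latest"` instead of a timestamped split always selects the most recent run.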
## Latest results
These are the [latest results from run 2023-07-19T14:55:52.470090](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Instruct-3B-v1/blob/main/results_2023-07-19T14%3A55%3A52.470090.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25651929626019937,
"acc_stderr": 0.031626569595913936,
"acc_norm": 0.25997865484677185,
"acc_norm_stderr": 0.031625762756392936,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931576,
"mc2": 0.3641251521956313,
"mc2_stderr": 0.013552794498811752
},
"harness|arc:challenge|25": {
"acc": 0.38310580204778155,
"acc_stderr": 0.014206472661672884,
"acc_norm": 0.41552901023890787,
"acc_norm_stderr": 0.014401366641216383
},
"harness|hellaswag|10": {
"acc": 0.48317068313085043,
"acc_stderr": 0.004986954139737528,
"acc_norm": 0.6548496315475005,
"acc_norm_stderr": 0.004744456628455116
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073465,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073465
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.20394736842105263,
"acc_stderr": 0.032790004063100515,
"acc_norm": 0.20394736842105263,
"acc_norm_stderr": 0.032790004063100515
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24150943396226415,
"acc_stderr": 0.02634148037111835,
"acc_norm": 0.24150943396226415,
"acc_norm_stderr": 0.02634148037111835
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.22916666666666666,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.22916666666666666,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.19653179190751446,
"acc_stderr": 0.030299574664788147,
"acc_norm": 0.19653179190751446,
"acc_norm_stderr": 0.030299574664788147
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.038739587141493524,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.038739587141493524
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3276595744680851,
"acc_stderr": 0.030683020843231004,
"acc_norm": 0.3276595744680851,
"acc_norm_stderr": 0.030683020843231004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.19310344827586207,
"acc_stderr": 0.032894455221273995,
"acc_norm": 0.19310344827586207,
"acc_norm_stderr": 0.032894455221273995
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24338624338624337,
"acc_stderr": 0.022101128787415426,
"acc_norm": 0.24338624338624337,
"acc_norm_stderr": 0.022101128787415426
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.03764950879790606,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.03764950879790606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22258064516129034,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.22258064516129034,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21182266009852216,
"acc_stderr": 0.02874898368994106,
"acc_norm": 0.21182266009852216,
"acc_norm_stderr": 0.02874898368994106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.0331750593000918,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.0331750593000918
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.1919191919191919,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.1919191919191919,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23316062176165803,
"acc_stderr": 0.030516111371476005,
"acc_norm": 0.23316062176165803,
"acc_norm_stderr": 0.030516111371476005
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.1794871794871795,
"acc_stderr": 0.0194573907876818,
"acc_norm": 0.1794871794871795,
"acc_norm_stderr": 0.0194573907876818
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712163,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712163
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.027025433498882374,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.027025433498882374
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360383,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360383
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21834862385321102,
"acc_stderr": 0.01771260052872273,
"acc_norm": 0.21834862385321102,
"acc_norm_stderr": 0.01771260052872273
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.026491914727355154,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.026491914727355154
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.02904133351059804,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.02904133351059804
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3004484304932735,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.3004484304932735,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2975206611570248,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.2975206611570248,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.04373313040914761,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.04373313040914761
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25153374233128833,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.25153374233128833,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404565,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404565
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.04245022486384493,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.04245022486384493
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2863247863247863,
"acc_stderr": 0.029614323690456648,
"acc_norm": 0.2863247863247863,
"acc_norm_stderr": 0.029614323690456648
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24265644955300128,
"acc_stderr": 0.015329888940899873,
"acc_norm": 0.24265644955300128,
"acc_norm_stderr": 0.015329888940899873
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2774566473988439,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.2774566473988439,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961452,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961452
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875192,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875192
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.27009646302250806,
"acc_stderr": 0.025218040373410616,
"acc_norm": 0.27009646302250806,
"acc_norm_stderr": 0.025218040373410616
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2345679012345679,
"acc_stderr": 0.023576881744005712,
"acc_norm": 0.2345679012345679,
"acc_norm_stderr": 0.023576881744005712
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843007,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843007
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27183833116036504,
"acc_stderr": 0.01136313527865141,
"acc_norm": 0.27183833116036504,
"acc_norm_stderr": 0.01136313527865141
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.22058823529411764,
"acc_stderr": 0.025187786660227248,
"acc_norm": 0.22058823529411764,
"acc_norm_stderr": 0.025187786660227248
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25326797385620914,
"acc_stderr": 0.01759348689536683,
"acc_norm": 0.25326797385620914,
"acc_norm_stderr": 0.01759348689536683
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.35454545454545455,
"acc_stderr": 0.04582004841505417,
"acc_norm": 0.35454545454545455,
"acc_norm_stderr": 0.04582004841505417
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.02992941540834836,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.02992941540834836
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3313253012048193,
"acc_stderr": 0.036643147772880864,
"acc_norm": 0.3313253012048193,
"acc_norm_stderr": 0.036643147772880864
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.034886477134579215,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.034886477134579215
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931576,
"mc2": 0.3641251521956313,
"mc2_stderr": 0.013552794498811752
}
}
```
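Given a results dict with the shape shown above, the per-task MMLU ("hendrycksTest") scores can be averaged into a single number, which is how the leaderboard reports MMLU. A minimal sketch, using three entries copied from the JSON above (truncated for brevity; the key names are the actual ones used by the harness):

```python
# Per-task results, in the same shape as the JSON block above
# (truncated to three tasks for brevity).
results = {
    "harness|arc:challenge|25": {"acc": 0.38310580204778155},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.25},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.22962962962962963},
}

# Average only the MMLU ("hendrycksTest") tasks, skipping ARC etc.
mmlu_accs = [
    scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mean_mmlu_acc = sum(mmlu_accs) / len(mmlu_accs)
print(f"Mean MMLU acc over {len(mmlu_accs)} tasks: {mean_mmlu_acc:.4f}")
```

On the full results above, the same loop runs over all 57 MMLU subjects; the `"all"` block already contains the harness's own aggregate over every task.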
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Chat-7B-v0.1 | 2023-08-27T12:37:35.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of togethercomputer/RedPajama-INCITE-Chat-7B-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [togethercomputer/RedPajama-INCITE-Chat-7B-v0.1](https://huggingface.co/togethercomputer/RedPajama-INCITE-Chat-7B-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Chat-7B-v0.1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-19T16:36:55.305122](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Chat-7B-v0.1/blob/main/results_2023-07-19T16%3A36%3A55.305122.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.27619387525296146,\n\
\ \"acc_stderr\": 0.03233610750347649,\n \"acc_norm\": 0.27942852251055406,\n\
\ \"acc_norm_stderr\": 0.032331753456860894,\n \"mc1\": 0.23011015911872704,\n\
\ \"mc1_stderr\": 0.014734557959807767,\n \"mc2\": 0.36094978005416933,\n\
\ \"mc2_stderr\": 0.01546050915660241\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.38993174061433444,\n \"acc_stderr\": 0.014252959848892887,\n\
\ \"acc_norm\": 0.4206484641638225,\n \"acc_norm_stderr\": 0.014426211252508406\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5480979884485162,\n\
\ \"acc_stderr\": 0.004966640868083862,\n \"acc_norm\": 0.7082254530969926,\n\
\ \"acc_norm_stderr\": 0.004536500714147989\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.03785714465066654,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.03785714465066654\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.26973684210526316,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.26973684210526316,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.28679245283018867,\n \"acc_stderr\": 0.027834912527544074,\n\
\ \"acc_norm\": 0.28679245283018867,\n \"acc_norm_stderr\": 0.027834912527544074\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3194444444444444,\n\
\ \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.3194444444444444,\n\
\ \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.0379328118530781,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.0379328118530781\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3148936170212766,\n \"acc_stderr\": 0.030363582197238156,\n\
\ \"acc_norm\": 0.3148936170212766,\n \"acc_norm_stderr\": 0.030363582197238156\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.04227054451232199,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.04227054451232199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.038061426873099935,\n\
\ \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.038061426873099935\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23809523809523808,\n \"acc_stderr\": 0.021935878081184766,\n \"\
acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.021935878081184766\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.0393253768039287,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.0393253768039287\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25483870967741934,\n\
\ \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.25483870967741934,\n\
\ \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.22167487684729065,\n \"acc_stderr\": 0.029225575892489614,\n\
\ \"acc_norm\": 0.22167487684729065,\n \"acc_norm_stderr\": 0.029225575892489614\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139405,\n\
\ \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139405\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.19696969696969696,\n \"acc_stderr\": 0.028335609732463355,\n \"\
acc_norm\": 0.19696969696969696,\n \"acc_norm_stderr\": 0.028335609732463355\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.25906735751295334,\n \"acc_stderr\": 0.031618779179354115,\n\
\ \"acc_norm\": 0.25906735751295334,\n \"acc_norm_stderr\": 0.031618779179354115\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2948717948717949,\n \"acc_stderr\": 0.02311936275823229,\n \
\ \"acc_norm\": 0.2948717948717949,\n \"acc_norm_stderr\": 0.02311936275823229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176892,\n\
\ \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176892\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.29541284403669726,\n\
\ \"acc_stderr\": 0.019560619182975997,\n \"acc_norm\": 0.29541284403669726,\n\
\ \"acc_norm_stderr\": 0.019560619182975997\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.18055555555555555,\n \"acc_stderr\": 0.026232878971491652,\n\
\ \"acc_norm\": 0.18055555555555555,\n \"acc_norm_stderr\": 0.026232878971491652\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.22058823529411764,\n \"acc_stderr\": 0.029102254389674082,\n \"\
acc_norm\": 0.22058823529411764,\n \"acc_norm_stderr\": 0.029102254389674082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2911392405063291,\n \"acc_stderr\": 0.02957160106575337,\n \
\ \"acc_norm\": 0.2911392405063291,\n \"acc_norm_stderr\": 0.02957160106575337\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3183856502242152,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.3183856502242152,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.30578512396694213,\n \"acc_stderr\": 0.04205953933884122,\n \"\
acc_norm\": 0.30578512396694213,\n \"acc_norm_stderr\": 0.04205953933884122\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.042878587513404565,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.042878587513404565\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2912621359223301,\n \"acc_stderr\": 0.04498676320572922,\n\
\ \"acc_norm\": 0.2912621359223301,\n \"acc_norm_stderr\": 0.04498676320572922\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3162393162393162,\n\
\ \"acc_stderr\": 0.030463656747340254,\n \"acc_norm\": 0.3162393162393162,\n\
\ \"acc_norm_stderr\": 0.030463656747340254\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.30395913154533843,\n\
\ \"acc_stderr\": 0.016448321686769046,\n \"acc_norm\": 0.30395913154533843,\n\
\ \"acc_norm_stderr\": 0.016448321686769046\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2832369942196532,\n \"acc_stderr\": 0.02425790170532337,\n\
\ \"acc_norm\": 0.2832369942196532,\n \"acc_norm_stderr\": 0.02425790170532337\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2558659217877095,\n\
\ \"acc_stderr\": 0.014593620923210728,\n \"acc_norm\": 0.2558659217877095,\n\
\ \"acc_norm_stderr\": 0.014593620923210728\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.025058503316958157,\n\
\ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.025058503316958157\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.31189710610932475,\n\
\ \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.31189710610932475,\n\
\ \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.025407197798890162,\n\
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.025407197798890162\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.29432624113475175,\n \"acc_stderr\": 0.027187127011503796,\n \
\ \"acc_norm\": 0.29432624113475175,\n \"acc_norm_stderr\": 0.027187127011503796\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24119947848761408,\n\
\ \"acc_stderr\": 0.01092649610203496,\n \"acc_norm\": 0.24119947848761408,\n\
\ \"acc_norm_stderr\": 0.01092649610203496\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.19117647058823528,\n \"acc_stderr\": 0.02388688192244036,\n\
\ \"acc_norm\": 0.19117647058823528,\n \"acc_norm_stderr\": 0.02388688192244036\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26633986928104575,\n \"acc_stderr\": 0.01788318813466719,\n \
\ \"acc_norm\": 0.26633986928104575,\n \"acc_norm_stderr\": 0.01788318813466719\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.21224489795918366,\n \"acc_stderr\": 0.026176967197866767,\n\
\ \"acc_norm\": 0.21224489795918366,\n \"acc_norm_stderr\": 0.026176967197866767\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.27860696517412936,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.27860696517412936,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2469879518072289,\n\
\ \"acc_stderr\": 0.03357351982064536,\n \"acc_norm\": 0.2469879518072289,\n\
\ \"acc_norm_stderr\": 0.03357351982064536\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3567251461988304,\n \"acc_stderr\": 0.03674013002860954,\n\
\ \"acc_norm\": 0.3567251461988304,\n \"acc_norm_stderr\": 0.03674013002860954\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23011015911872704,\n\
\ \"mc1_stderr\": 0.014734557959807767,\n \"mc2\": 0.36094978005416933,\n\
\ \"mc2_stderr\": 0.01546050915660241\n }\n}\n```"
repo_url: https://huggingface.co/togethercomputer/RedPajama-INCITE-Chat-7B-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:36:55.305122.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:36:55.305122.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:36:55.305122.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:36:55.305122.parquet'
- config_name: results
data_files:
- split: 2023_07_19T16_36_55.305122
path:
- results_2023-07-19T16:36:55.305122.parquet
- split: latest
path:
- results_2023-07-19T16:36:55.305122.parquet
---
# Dataset Card for Evaluation run of togethercomputer/RedPajama-INCITE-Chat-7B-v0.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/togethercomputer/RedPajama-INCITE-Chat-7B-v0.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [togethercomputer/RedPajama-INCITE-Chat-7B-v0.1](https://huggingface.co/togethercomputer/RedPajama-INCITE-Chat-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Chat-7B-v0.1",
"harness_truthfulqa_mc_0",
split="train")
```
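Each run is stored under a split named after its timestamp (here `2023_07_19T16_36_55.305122`), with `latest` mirroring the most recent one. If several runs accumulate, the newest timestamped split can be identified by parsing the names. A minimal sketch, assuming only the naming pattern shown in the configs above (`newest_split` is a hypothetical helper, not part of the `datasets` API):

```python
from datetime import datetime

def newest_split(split_names):
    """Return the most recent timestamped split name.

    Split names look like '2023_07_19T16_36_55.305122': the run timestamp
    with '-' and ':' replaced by '_'. Non-timestamp names ('latest') are skipped.
    """
    def parse(name):
        try:
            return datetime.strptime(name, "%Y_%m_%dT%H_%M_%S.%f")
        except ValueError:
            return None  # not a timestamped split
    stamped = [(parse(n), n) for n in split_names]
    stamped = [(t, n) for t, n in stamped if t is not None]
    return max(stamped)[1] if stamped else None

print(newest_split(["latest", "2023_07_19T16_36_55.305122"]))
# -> 2023_07_19T16_36_55.305122
```

The selected name can then be passed as `split=` to `load_dataset` instead of `"latest"`.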
## Latest results
These are the [latest results from run 2023-07-19T16:36:55.305122](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Chat-7B-v0.1/blob/main/results_2023-07-19T16%3A36%3A55.305122.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.27619387525296146,
"acc_stderr": 0.03233610750347649,
"acc_norm": 0.27942852251055406,
"acc_norm_stderr": 0.032331753456860894,
"mc1": 0.23011015911872704,
"mc1_stderr": 0.014734557959807767,
"mc2": 0.36094978005416933,
"mc2_stderr": 0.01546050915660241
},
"harness|arc:challenge|25": {
"acc": 0.38993174061433444,
"acc_stderr": 0.014252959848892887,
"acc_norm": 0.4206484641638225,
"acc_norm_stderr": 0.014426211252508406
},
"harness|hellaswag|10": {
"acc": 0.5480979884485162,
"acc_stderr": 0.004966640868083862,
"acc_norm": 0.7082254530969926,
"acc_norm_stderr": 0.004536500714147989
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.03785714465066654,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.03785714465066654
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.26973684210526316,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.26973684210526316,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.28679245283018867,
"acc_stderr": 0.027834912527544074,
"acc_norm": 0.28679245283018867,
"acc_norm_stderr": 0.027834912527544074
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3194444444444444,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.3194444444444444,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.0379328118530781,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.0379328118530781
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3148936170212766,
"acc_stderr": 0.030363582197238156,
"acc_norm": 0.3148936170212766,
"acc_norm_stderr": 0.030363582197238156
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.04227054451232199,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.04227054451232199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.296551724137931,
"acc_stderr": 0.038061426873099935,
"acc_norm": 0.296551724137931,
"acc_norm_stderr": 0.038061426873099935
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.021935878081184766,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.021935878081184766
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.0393253768039287,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.0393253768039287
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25483870967741934,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.25483870967741934,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.22167487684729065,
"acc_stderr": 0.029225575892489614,
"acc_norm": 0.22167487684729065,
"acc_norm_stderr": 0.029225575892489614
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.03374402644139405,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.03374402644139405
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.19696969696969696,
"acc_stderr": 0.028335609732463355,
"acc_norm": 0.19696969696969696,
"acc_norm_stderr": 0.028335609732463355
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.25906735751295334,
"acc_stderr": 0.031618779179354115,
"acc_norm": 0.25906735751295334,
"acc_norm_stderr": 0.031618779179354115
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.02311936275823229,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.02311936275823229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.02620276653465215,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.02620276653465215
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.29541284403669726,
"acc_stderr": 0.019560619182975997,
"acc_norm": 0.29541284403669726,
"acc_norm_stderr": 0.019560619182975997
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18055555555555555,
"acc_stderr": 0.026232878971491652,
"acc_norm": 0.18055555555555555,
"acc_norm_stderr": 0.026232878971491652
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22058823529411764,
"acc_stderr": 0.029102254389674082,
"acc_norm": 0.22058823529411764,
"acc_norm_stderr": 0.029102254389674082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2911392405063291,
"acc_stderr": 0.02957160106575337,
"acc_norm": 0.2911392405063291,
"acc_norm_stderr": 0.02957160106575337
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3183856502242152,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.3183856502242152,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.30578512396694213,
"acc_stderr": 0.04205953933884122,
"acc_norm": 0.30578512396694213,
"acc_norm_stderr": 0.04205953933884122
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404565,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404565
},
"harness|hendrycksTest-management|5": {
"acc": 0.2912621359223301,
"acc_stderr": 0.04498676320572922,
"acc_norm": 0.2912621359223301,
"acc_norm_stderr": 0.04498676320572922
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3162393162393162,
"acc_stderr": 0.030463656747340254,
"acc_norm": 0.3162393162393162,
"acc_norm_stderr": 0.030463656747340254
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.30395913154533843,
"acc_stderr": 0.016448321686769046,
"acc_norm": 0.30395913154533843,
"acc_norm_stderr": 0.016448321686769046
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2832369942196532,
"acc_stderr": 0.02425790170532337,
"acc_norm": 0.2832369942196532,
"acc_norm_stderr": 0.02425790170532337
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2558659217877095,
"acc_stderr": 0.014593620923210728,
"acc_norm": 0.2558659217877095,
"acc_norm_stderr": 0.014593620923210728
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.025058503316958157,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.025058503316958157
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.31189710610932475,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.31189710610932475,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.025407197798890162,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.025407197798890162
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.29432624113475175,
"acc_stderr": 0.027187127011503796,
"acc_norm": 0.29432624113475175,
"acc_norm_stderr": 0.027187127011503796
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24119947848761408,
"acc_stderr": 0.01092649610203496,
"acc_norm": 0.24119947848761408,
"acc_norm_stderr": 0.01092649610203496
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.19117647058823528,
"acc_stderr": 0.02388688192244036,
"acc_norm": 0.19117647058823528,
"acc_norm_stderr": 0.02388688192244036
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26633986928104575,
"acc_stderr": 0.01788318813466719,
"acc_norm": 0.26633986928104575,
"acc_norm_stderr": 0.01788318813466719
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.21224489795918366,
"acc_stderr": 0.026176967197866767,
"acc_norm": 0.21224489795918366,
"acc_norm_stderr": 0.026176967197866767
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.27860696517412936,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.27860696517412936,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2469879518072289,
"acc_stderr": 0.03357351982064536,
"acc_norm": 0.2469879518072289,
"acc_norm_stderr": 0.03357351982064536
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3567251461988304,
"acc_stderr": 0.03674013002860954,
"acc_norm": 0.3567251461988304,
"acc_norm_stderr": 0.03674013002860954
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23011015911872704,
"mc1_stderr": 0.014734557959807767,
"mc2": 0.36094978005416933,
"mc2_stderr": 0.01546050915660241
}
}
```
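The `all` block at the top of the results is a macro-average of the per-task metrics. It can be recomputed from a results dict shaped like the JSON above; a short sketch using an excerpt of the values (a full recomputation would average every task, not just the three excerpted here):

```python
# Macro-average "acc" over the MMLU (hendrycksTest) tasks of a results dict
# shaped like the JSON above; only three tasks are excerpted for illustration.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.25},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.25925925925925924},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.26973684210526316},
}

accs = [v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
avg = sum(accs) / len(accs)
print(round(avg, 4))  # macro-average over the excerpted tasks
```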
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-7B-Instruct | 2023-08-27T12:37:37.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of togethercomputer/RedPajama-INCITE-7B-Instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [togethercomputer/RedPajama-INCITE-7B-Instruct](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named after\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-7B-Instruct\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-19T16:41:06.835084](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-7B-Instruct/blob/main/results_2023-07-19T16%3A41%3A06.835084.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3794636129027312,\n\
\ \"acc_stderr\": 0.034697341663262576,\n \"acc_norm\": 0.3831146924370621,\n\
\ \"acc_norm_stderr\": 0.03469085798373581,\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.014843061507731624,\n \"mc2\": 0.33957284047541675,\n\
\ \"mc2_stderr\": 0.013418996397984223\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4138225255972696,\n \"acc_stderr\": 0.014392730009221005,\n\
\ \"acc_norm\": 0.44112627986348124,\n \"acc_norm_stderr\": 0.014509747749064666\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5320653256323441,\n\
\ \"acc_stderr\": 0.004979510001776621,\n \"acc_norm\": 0.7201752638916551,\n\
\ \"acc_norm_stderr\": 0.004479955169853626\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.41,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4339622641509434,\n \"acc_stderr\": 0.030503292013342596,\n\
\ \"acc_norm\": 0.4339622641509434,\n \"acc_norm_stderr\": 0.030503292013342596\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3958333333333333,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.3958333333333333,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3468208092485549,\n\
\ \"acc_stderr\": 0.03629146670159663,\n \"acc_norm\": 0.3468208092485549,\n\
\ \"acc_norm_stderr\": 0.03629146670159663\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2978723404255319,\n \"acc_stderr\": 0.02989614568209546,\n\
\ \"acc_norm\": 0.2978723404255319,\n \"acc_norm_stderr\": 0.02989614568209546\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n\
\ \"acc_stderr\": 0.038351539543994194,\n \"acc_norm\": 0.21052631578947367,\n\
\ \"acc_norm_stderr\": 0.038351539543994194\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.036646663372252565,\n\
\ \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.036646663372252565\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.291005291005291,\n \"acc_stderr\": 0.023393826500484875,\n \"\
acc_norm\": 0.291005291005291,\n \"acc_norm_stderr\": 0.023393826500484875\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4290322580645161,\n\
\ \"acc_stderr\": 0.02815603653823321,\n \"acc_norm\": 0.4290322580645161,\n\
\ \"acc_norm_stderr\": 0.02815603653823321\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.031785297106427496,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.031785297106427496\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4121212121212121,\n \"acc_stderr\": 0.03843566993588717,\n\
\ \"acc_norm\": 0.4121212121212121,\n \"acc_norm_stderr\": 0.03843566993588717\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.41414141414141414,\n \"acc_stderr\": 0.035094383488796295,\n \"\
acc_norm\": 0.41414141414141414,\n \"acc_norm_stderr\": 0.035094383488796295\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.533678756476684,\n \"acc_stderr\": 0.03600244069867179,\n\
\ \"acc_norm\": 0.533678756476684,\n \"acc_norm_stderr\": 0.03600244069867179\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.382051282051282,\n \"acc_stderr\": 0.02463554916390823,\n \
\ \"acc_norm\": 0.382051282051282,\n \"acc_norm_stderr\": 0.02463554916390823\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3403361344537815,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.3403361344537815,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.46788990825688076,\n \"acc_stderr\": 0.021393071222680807,\n \"\
acc_norm\": 0.46788990825688076,\n \"acc_norm_stderr\": 0.021393071222680807\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.19907407407407407,\n \"acc_stderr\": 0.02723229846269023,\n \"\
acc_norm\": 0.19907407407407407,\n \"acc_norm_stderr\": 0.02723229846269023\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.46078431372549017,\n \"acc_stderr\": 0.03498501649369527,\n \"\
acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.03498501649369527\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.47257383966244726,\n \"acc_stderr\": 0.032498227183013026,\n \
\ \"acc_norm\": 0.47257383966244726,\n \"acc_norm_stderr\": 0.032498227183013026\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5067264573991032,\n\
\ \"acc_stderr\": 0.033554765962343545,\n \"acc_norm\": 0.5067264573991032,\n\
\ \"acc_norm_stderr\": 0.033554765962343545\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3893129770992366,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.3893129770992366,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5537190082644629,\n \"acc_stderr\": 0.0453793517794788,\n \"acc_norm\"\
: 0.5537190082644629,\n \"acc_norm_stderr\": 0.0453793517794788\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.04750077341199986,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.04750077341199986\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4049079754601227,\n \"acc_stderr\": 0.03856672163548913,\n\
\ \"acc_norm\": 0.4049079754601227,\n \"acc_norm_stderr\": 0.03856672163548913\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.043270409325787296,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.043270409325787296\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.44660194174757284,\n \"acc_stderr\": 0.04922424153458934,\n\
\ \"acc_norm\": 0.44660194174757284,\n \"acc_norm_stderr\": 0.04922424153458934\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5256410256410257,\n\
\ \"acc_stderr\": 0.03271298896811159,\n \"acc_norm\": 0.5256410256410257,\n\
\ \"acc_norm_stderr\": 0.03271298896811159\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5338441890166028,\n\
\ \"acc_stderr\": 0.017838956009136802,\n \"acc_norm\": 0.5338441890166028,\n\
\ \"acc_norm_stderr\": 0.017838956009136802\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3930635838150289,\n \"acc_stderr\": 0.026296227915613684,\n\
\ \"acc_norm\": 0.3930635838150289,\n \"acc_norm_stderr\": 0.026296227915613684\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.014355911964767865,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.014355911964767865\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.35947712418300654,\n \"acc_stderr\": 0.027475969910660952,\n\
\ \"acc_norm\": 0.35947712418300654,\n \"acc_norm_stderr\": 0.027475969910660952\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.39228295819935693,\n\
\ \"acc_stderr\": 0.027731258647011998,\n \"acc_norm\": 0.39228295819935693,\n\
\ \"acc_norm_stderr\": 0.027731258647011998\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4228395061728395,\n \"acc_stderr\": 0.02748747298087161,\n\
\ \"acc_norm\": 0.4228395061728395,\n \"acc_norm_stderr\": 0.02748747298087161\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3262411347517731,\n \"acc_stderr\": 0.02796845304356317,\n \
\ \"acc_norm\": 0.3262411347517731,\n \"acc_norm_stderr\": 0.02796845304356317\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3194263363754889,\n\
\ \"acc_stderr\": 0.011908357176756158,\n \"acc_norm\": 0.3194263363754889,\n\
\ \"acc_norm_stderr\": 0.011908357176756158\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3014705882352941,\n \"acc_stderr\": 0.027875982114273168,\n\
\ \"acc_norm\": 0.3014705882352941,\n \"acc_norm_stderr\": 0.027875982114273168\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3839869281045752,\n \"acc_stderr\": 0.01967580813528152,\n \
\ \"acc_norm\": 0.3839869281045752,\n \"acc_norm_stderr\": 0.01967580813528152\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04789131426105757,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04789131426105757\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2571428571428571,\n \"acc_stderr\": 0.027979823538744546,\n\
\ \"acc_norm\": 0.2571428571428571,\n \"acc_norm_stderr\": 0.027979823538744546\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.42786069651741293,\n\
\ \"acc_stderr\": 0.03498541988407795,\n \"acc_norm\": 0.42786069651741293,\n\
\ \"acc_norm_stderr\": 0.03498541988407795\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n\
\ \"acc_stderr\": 0.0362933532994786,\n \"acc_norm\": 0.3192771084337349,\n\
\ \"acc_norm_stderr\": 0.0362933532994786\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5146198830409356,\n \"acc_stderr\": 0.03833185275213026,\n\
\ \"acc_norm\": 0.5146198830409356,\n \"acc_norm_stderr\": 0.03833185275213026\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.014843061507731624,\n \"mc2\": 0.33957284047541675,\n\
\ \"mc2_stderr\": 0.013418996397984223\n }\n}\n```"
repo_url: https://huggingface.co/togethercomputer/Pythia-Chat-Base-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:41:06.835084.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:41:06.835084.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:41:06.835084.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:41:06.835084.parquet'
- config_name: results
data_files:
- split: 2023_07_19T16_41_06.835084
path:
- results_2023-07-19T16:41:06.835084.parquet
- split: latest
path:
- results_2023-07-19T16:41:06.835084.parquet
---
# Dataset Card for Evaluation run of togethercomputer/RedPajama-INCITE-7B-Instruct
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [togethercomputer/RedPajama-INCITE-7B-Instruct](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-7B-Instruct",
"harness_truthfulqa_mc_0",
split="train")
```
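As noted above, each run is stored under a split named after the run timestamp. A small illustrative sketch of that naming convention (the `split_name` helper is an assumption for illustration, not part of the dataset tooling):

```python
def split_name(timestamp: str) -> str:
    """Map a run timestamp to its split name (illustrative helper).

    In the configs above, "-" and ":" both become "_", while the
    fractional-seconds "." is kept as-is.
    """
    return timestamp.replace("-", "_").replace(":", "_")

print(split_name("2023-07-19T16:41:06.835084"))  # -> 2023_07_19T16_41_06.835084
```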
## Latest results
These are the [latest results from run 2023-07-19T16:41:06.835084](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-7B-Instruct/blob/main/results_2023-07-19T16%3A41%3A06.835084.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3794636129027312,
"acc_stderr": 0.034697341663262576,
"acc_norm": 0.3831146924370621,
"acc_norm_stderr": 0.03469085798373581,
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731624,
"mc2": 0.33957284047541675,
"mc2_stderr": 0.013418996397984223
},
"harness|arc:challenge|25": {
"acc": 0.4138225255972696,
"acc_stderr": 0.014392730009221005,
"acc_norm": 0.44112627986348124,
"acc_norm_stderr": 0.014509747749064666
},
"harness|hellaswag|10": {
"acc": 0.5320653256323441,
"acc_stderr": 0.004979510001776621,
"acc_norm": 0.7201752638916551,
"acc_norm_stderr": 0.004479955169853626
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4339622641509434,
"acc_stderr": 0.030503292013342596,
"acc_norm": 0.4339622641509434,
"acc_norm_stderr": 0.030503292013342596
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3958333333333333,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.3958333333333333,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3468208092485549,
"acc_stderr": 0.03629146670159663,
"acc_norm": 0.3468208092485549,
"acc_norm_stderr": 0.03629146670159663
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2978723404255319,
"acc_stderr": 0.02989614568209546,
"acc_norm": 0.2978723404255319,
"acc_norm_stderr": 0.02989614568209546
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.038351539543994194,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.038351539543994194
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.036646663372252565,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.036646663372252565
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.291005291005291,
"acc_stderr": 0.023393826500484875,
"acc_norm": 0.291005291005291,
"acc_norm_stderr": 0.023393826500484875
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4290322580645161,
"acc_stderr": 0.02815603653823321,
"acc_norm": 0.4290322580645161,
"acc_norm_stderr": 0.02815603653823321
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.031785297106427496,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.031785297106427496
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4121212121212121,
"acc_stderr": 0.03843566993588717,
"acc_norm": 0.4121212121212121,
"acc_norm_stderr": 0.03843566993588717
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.41414141414141414,
"acc_stderr": 0.035094383488796295,
"acc_norm": 0.41414141414141414,
"acc_norm_stderr": 0.035094383488796295
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.533678756476684,
"acc_stderr": 0.03600244069867179,
"acc_norm": 0.533678756476684,
"acc_norm_stderr": 0.03600244069867179
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.382051282051282,
"acc_stderr": 0.02463554916390823,
"acc_norm": 0.382051282051282,
"acc_norm_stderr": 0.02463554916390823
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3403361344537815,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.3403361344537815,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.46788990825688076,
"acc_stderr": 0.021393071222680807,
"acc_norm": 0.46788990825688076,
"acc_norm_stderr": 0.021393071222680807
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.19907407407407407,
"acc_stderr": 0.02723229846269023,
"acc_norm": 0.19907407407407407,
"acc_norm_stderr": 0.02723229846269023
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.03498501649369527,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.03498501649369527
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.47257383966244726,
"acc_stderr": 0.032498227183013026,
"acc_norm": 0.47257383966244726,
"acc_norm_stderr": 0.032498227183013026
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5067264573991032,
"acc_stderr": 0.033554765962343545,
"acc_norm": 0.5067264573991032,
"acc_norm_stderr": 0.033554765962343545
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3893129770992366,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.3893129770992366,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5537190082644629,
"acc_stderr": 0.0453793517794788,
"acc_norm": 0.5537190082644629,
"acc_norm_stderr": 0.0453793517794788
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.04750077341199986,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.04750077341199986
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4049079754601227,
"acc_stderr": 0.03856672163548913,
"acc_norm": 0.4049079754601227,
"acc_norm_stderr": 0.03856672163548913
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.043270409325787296,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.043270409325787296
},
"harness|hendrycksTest-management|5": {
"acc": 0.44660194174757284,
"acc_stderr": 0.04922424153458934,
"acc_norm": 0.44660194174757284,
"acc_norm_stderr": 0.04922424153458934
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5256410256410257,
"acc_stderr": 0.03271298896811159,
"acc_norm": 0.5256410256410257,
"acc_norm_stderr": 0.03271298896811159
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5338441890166028,
"acc_stderr": 0.017838956009136802,
"acc_norm": 0.5338441890166028,
"acc_norm_stderr": 0.017838956009136802
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3930635838150289,
"acc_stderr": 0.026296227915613684,
"acc_norm": 0.3930635838150289,
"acc_norm_stderr": 0.026296227915613684
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767865,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767865
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.35947712418300654,
"acc_stderr": 0.027475969910660952,
"acc_norm": 0.35947712418300654,
"acc_norm_stderr": 0.027475969910660952
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.39228295819935693,
"acc_stderr": 0.027731258647011998,
"acc_norm": 0.39228295819935693,
"acc_norm_stderr": 0.027731258647011998
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4228395061728395,
"acc_stderr": 0.02748747298087161,
"acc_norm": 0.4228395061728395,
"acc_norm_stderr": 0.02748747298087161
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3262411347517731,
"acc_stderr": 0.02796845304356317,
"acc_norm": 0.3262411347517731,
"acc_norm_stderr": 0.02796845304356317
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3194263363754889,
"acc_stderr": 0.011908357176756158,
"acc_norm": 0.3194263363754889,
"acc_norm_stderr": 0.011908357176756158
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3014705882352941,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.3014705882352941,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3839869281045752,
"acc_stderr": 0.01967580813528152,
"acc_norm": 0.3839869281045752,
"acc_norm_stderr": 0.01967580813528152
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5,
"acc_stderr": 0.04789131426105757,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04789131426105757
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2571428571428571,
"acc_stderr": 0.027979823538744546,
"acc_norm": 0.2571428571428571,
"acc_norm_stderr": 0.027979823538744546
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.42786069651741293,
"acc_stderr": 0.03498541988407795,
"acc_norm": 0.42786069651741293,
"acc_norm_stderr": 0.03498541988407795
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.0362933532994786,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.0362933532994786
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5146198830409356,
"acc_stderr": 0.03833185275213026,
"acc_norm": 0.5146198830409356,
"acc_norm_stderr": 0.03833185275213026
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731624,
"mc2": 0.33957284047541675,
"mc2_stderr": 0.013418996397984223
}
}
```
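To give a sense of how the report above can be post-processed: the top-level `"all"` block appears to average each metric over the tasks that report it. A minimal sketch using a hypothetical three-task excerpt (not the full report, so the value below differs from the `"all"` figure above):

```python
from statistics import mean

# Hypothetical excerpt of the per-task scores above (not the full report).
per_task = {
    "harness|arc:challenge|25": {"acc": 0.4138225255972696},
    "harness|hellaswag|10": {"acc": 0.5320653256323441},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.25},
}

# Average the "acc" metric over the tasks that report it.
aggregate_acc = mean(scores["acc"] for scores in per_task.values())
print(round(aggregate_acc, 4))  # -> 0.3986
```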
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_cyberagent__open-calm-7b | 2023-09-18T05:13:53.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of cyberagent/open-calm-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cyberagent/open-calm-7b](https://huggingface.co/cyberagent/open-calm-7b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cyberagent__open-calm-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-18T05:13:41.363810](https://huggingface.co/datasets/open-llm-leaderboard/details_cyberagent__open-calm-7b/blob/main/results_2023-09-18T05-13-41.363810.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each one in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.009123322147651007,\n\
\ \"em_stderr\": 0.000973701770554162,\n \"f1\": 0.039316275167785276,\n\
\ \"f1_stderr\": 0.0014587233446804973,\n \"acc\": 0.24383651483119942,\n\
\ \"acc_stderr\": 0.00767932509907164\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.009123322147651007,\n \"em_stderr\": 0.000973701770554162,\n\
\ \"f1\": 0.039316275167785276,\n \"f1_stderr\": 0.0014587233446804973\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \
\ \"acc_stderr\": 0.0013121578148674368\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.48539857932123126,\n \"acc_stderr\": 0.014046492383275842\n\
\ }\n}\n```"
repo_url: https://huggingface.co/cyberagent/open-calm-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_18T05_13_41.363810
path:
- '**/details_harness|drop|3_2023-09-18T05-13-41.363810.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-18T05-13-41.363810.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_18T05_13_41.363810
path:
- '**/details_harness|gsm8k|5_2023-09-18T05-13-41.363810.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-18T05-13-41.363810.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_18T05_13_41.363810
path:
- '**/details_harness|winogrande|5_2023-09-18T05-13-41.363810.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-18T05-13-41.363810.parquet'
- config_name: results
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- results_2023-07-19T16:51:08.421995.parquet
- split: 2023_09_18T05_13_41.363810
path:
- results_2023-09-18T05-13-41.363810.parquet
- split: latest
path:
- results_2023-09-18T05-13-41.363810.parquet
---
# Dataset Card for Evaluation run of cyberagent/open-calm-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/cyberagent/open-calm-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [cyberagent/open-calm-7b](https://huggingface.co/cyberagent/open-calm-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cyberagent__open-calm-7b",
"harness_winogrande_5",
split="train")
```
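The split names in the configs above are derived from the run timestamps, with characters that are not valid in split names replaced by underscores. A minimal sketch of that apparent convention (inferred from the listing in this card, not an official API):

```python
# Derive a split name from a run timestamp, following the apparent
# convention in the config listing: "-" in the date and ":" in the
# time become "_", e.g.
# "2023-09-18T05:13:41.363810" -> "2023_09_18T05_13_41.363810"
def run_timestamp_to_split(timestamp: str) -> str:
    date, time = timestamp.split("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")

print(run_timestamp_to_split("2023-09-18T05:13:41.363810"))
```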
## Latest results
These are the [latest results from run 2023-09-18T05:13:41.363810](https://huggingface.co/datasets/open-llm-leaderboard/details_cyberagent__open-calm-7b/blob/main/results_2023-09-18T05-13-41.363810.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.009123322147651007,
"em_stderr": 0.000973701770554162,
"f1": 0.039316275167785276,
"f1_stderr": 0.0014587233446804973,
"acc": 0.24383651483119942,
"acc_stderr": 0.00767932509907164
},
"harness|drop|3": {
"em": 0.009123322147651007,
"em_stderr": 0.000973701770554162,
"f1": 0.039316275167785276,
"f1_stderr": 0.0014587233446804973
},
"harness|gsm8k|5": {
"acc": 0.002274450341167551,
"acc_stderr": 0.0013121578148674368
},
"harness|winogrande|5": {
"acc": 0.48539857932123126,
"acc_stderr": 0.014046492383275842
}
}
```
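The aggregated "acc" under "all" appears to be the unweighted mean of the per-task accuracies; a quick check against the gsm8k and winogrande values above:

```python
# Sanity-check: the "all" accuracy appears to be the unweighted mean
# of the per-task accuracies reported above (drop contributes em/f1,
# not acc).
gsm8k_acc = 0.002274450341167551
winogrande_acc = 0.48539857932123126
mean_acc = (gsm8k_acc + winogrande_acc) / 2
# mean_acc is ~0.2438365..., matching the reported "all" acc
# up to floating-point rounding.
```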
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]

---
pretty_name: Evaluation run of concedo/OPT-19M-ChatSalad
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [concedo/OPT-19M-ChatSalad](https://huggingface.co/concedo/OPT-19M-ChatSalad)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_concedo__OPT-19M-ChatSalad\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T13:42:25.445156](https://huggingface.co/datasets/open-llm-leaderboard/details_concedo__OPT-19M-ChatSalad/blob/main/results_2023-09-22T13-42-25.445156.json)\
\ (note that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each of them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"\
em_stderr\": 0.0,\n \"f1\": 0.0024863674496644274,\n \"f1_stderr\"\
: 0.0002550496086684011,\n \"acc\": 0.24861878453038674,\n \"acc_stderr\"\
: 0.007026135605808221\n },\n \"harness|drop|3\": {\n \"em\": 0.0,\n\
\ \"em_stderr\": 0.0,\n \"f1\": 0.0024863674496644274,\n \"\
f1_stderr\": 0.0002550496086684011\n },\n \"harness|gsm8k|5\": {\n \
\ \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.4972375690607735,\n \"acc_stderr\": 0.014052271211616441\n\
\ }\n}\n```"
repo_url: https://huggingface.co/concedo/OPT-19M-ChatSalad
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T13_42_25.445156
path:
- '**/details_harness|drop|3_2023-09-22T13-42-25.445156.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T13-42-25.445156.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T13_42_25.445156
path:
- '**/details_harness|gsm8k|5_2023-09-22T13-42-25.445156.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T13-42-25.445156.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:30:17.272494.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:30:17.272494.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:30:17.272494.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T13_42_25.445156
path:
- '**/details_harness|winogrande|5_2023-09-22T13-42-25.445156.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T13-42-25.445156.parquet'
- config_name: results
data_files:
- split: 2023_07_19T13_30_17.272494
path:
- results_2023-07-19T13:30:17.272494.parquet
- split: 2023_09_22T13_42_25.445156
path:
- results_2023-09-22T13-42-25.445156.parquet
- split: latest
path:
- results_2023-09-22T13-42-25.445156.parquet
---
# Dataset Card for Evaluation run of concedo/OPT-19M-ChatSalad
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/concedo/OPT-19M-ChatSalad
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [concedo/OPT-19M-ChatSalad](https://huggingface.co/concedo/OPT-19M-ChatSalad) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_concedo__OPT-19M-ChatSalad",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-22T13:42:25.445156](https://huggingface.co/datasets/open-llm-leaderboard/details_concedo__OPT-19M-ChatSalad/blob/main/results_2023-09-22T13-42-25.445156.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.0024863674496644274,
"f1_stderr": 0.0002550496086684011,
"acc": 0.24861878453038674,
"acc_stderr": 0.007026135605808221
},
"harness|drop|3": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.0024863674496644274,
"f1_stderr": 0.0002550496086684011
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.4972375690607735,
"acc_stderr": 0.014052271211616441
}
}
```
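For instance, the per-task accuracies can be extracted from this JSON with a plain dict comprehension (a minimal sketch using the values shown above; the keys follow the `harness|<task>|<num_fewshot>` naming used throughout this card):

```python
# The results JSON above, as a Python dict (values copied from this card).
results = {
    "all": {"em": 0.0, "em_stderr": 0.0, "f1": 0.0024863674496644274,
            "f1_stderr": 0.0002550496086684011, "acc": 0.24861878453038674,
            "acc_stderr": 0.007026135605808221},
    "harness|drop|3": {"em": 0.0, "em_stderr": 0.0,
                       "f1": 0.0024863674496644274,
                       "f1_stderr": 0.0002550496086684011},
    "harness|gsm8k|5": {"acc": 0.0, "acc_stderr": 0.0},
    "harness|winogrande|5": {"acc": 0.4972375690607735,
                             "acc_stderr": 0.014052271211616441},
}

# Per-task accuracy, skipping the "all" aggregate and tasks without an acc metric.
task_acc = {task: metrics["acc"]
            for task, metrics in results.items()
            if task != "all" and "acc" in metrics}
```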
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_concedo__Pythia-70M-ChatSalad | 2023-09-22T19:59:24.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of concedo/Pythia-70M-ChatSalad
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [concedo/Pythia-70M-ChatSalad](https://huggingface.co/concedo/Pythia-70M-ChatSalad)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_concedo__Pythia-70M-ChatSalad\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T19:59:13.355253](https://huggingface.co/datasets/open-llm-leaderboard/details_concedo__Pythia-70M-ChatSalad/blob/main/results_2023-09-22T19-59-13.355253.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n\
\ \"em_stderr\": 0.00039210421902982634,\n \"f1\": 0.008363045302013424,\n\
\ \"f1_stderr\": 0.0006175853648384896,\n \"acc\": 0.26203630623520124,\n\
\ \"acc_stderr\": 0.0070180948326975644\n },\n \"harness|drop|3\":\
\ {\n \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.00039210421902982634,\n\
\ \"f1\": 0.008363045302013424,\n \"f1_stderr\": 0.0006175853648384896\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5240726124704025,\n\
\ \"acc_stderr\": 0.014036189665395129\n }\n}\n```"
repo_url: https://huggingface.co/concedo/Pythia-70M-ChatSalad
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T19_59_13.355253
path:
- '**/details_harness|drop|3_2023-09-22T19-59-13.355253.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T19-59-13.355253.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T19_59_13.355253
path:
- '**/details_harness|gsm8k|5_2023-09-22T19-59-13.355253.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T19-59-13.355253.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T19_59_13.355253
path:
- '**/details_harness|winogrande|5_2023-09-22T19-59-13.355253.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T19-59-13.355253.parquet'
- config_name: results
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- results_2023-07-19T13:36:47.045814.parquet
- split: 2023_09_22T19_59_13.355253
path:
- results_2023-09-22T19-59-13.355253.parquet
- split: latest
path:
- results_2023-09-22T19-59-13.355253.parquet
---
# Dataset Card for Evaluation run of concedo/Pythia-70M-ChatSalad
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/concedo/Pythia-70M-ChatSalad
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [concedo/Pythia-70M-ChatSalad](https://huggingface.co/concedo/Pythia-70M-ChatSalad) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_concedo__Pythia-70M-ChatSalad",
"harness_winogrande_5",
split="train")
```
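As the configs above show, each run's split name encodes the run timestamp with underscores in place of colons (e.g. `2023_09_22T19_59_13.355253`), and `latest` resolves to the most recent run. If you ever need to pick the newest timestamped split yourself, a minimal sketch (assuming the `%Y_%m_%dT%H_%M_%S.%f` naming convention seen above; not an official API) could look like:

```python
from datetime import datetime

# Split names observed in the configs of this dataset card.
splits = ["2023_07_19T13_36_47.045814", "2023_09_22T19_59_13.355253"]

def latest_split(names):
    # Parse the underscore-separated timestamp format used in split names
    # and return the most recent one.
    return max(names, key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"))

print(latest_split(splits))  # -> 2023_09_22T19_59_13.355253
```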
## Latest results
These are the [latest results from run 2023-09-22T19:59:13.355253](https://huggingface.co/datasets/open-llm-leaderboard/details_concedo__Pythia-70M-ChatSalad/blob/main/results_2023-09-22T19-59-13.355253.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0014681208053691276,
"em_stderr": 0.00039210421902982634,
"f1": 0.008363045302013424,
"f1_stderr": 0.0006175853648384896,
"acc": 0.26203630623520124,
"acc_stderr": 0.0070180948326975644
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.00039210421902982634,
"f1": 0.008363045302013424,
"f1_stderr": 0.0006175853648384896
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5240726124704025,
"acc_stderr": 0.014036189665395129
}
}
```
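As a quick sanity check on the numbers above, the top-level `"all"` accuracy appears to be the plain mean of the per-task `acc` values (gsm8k and winogrande; drop reports `em`/`f1` instead). This is an informal sketch based on the reported figures, not the leaderboard's official aggregation code:

```python
# Per-task accuracies copied from the results block above.
task_acc = {
    "harness|gsm8k|5": 0.0,
    "harness|winogrande|5": 0.5240726124704025,
}

# Unweighted mean across the tasks that report an "acc" metric.
mean_acc = sum(task_acc.values()) / len(task_acc)

# Matches the reported "all" acc of 0.26203630623520124 (up to float rounding).
assert abs(mean_acc - 0.26203630623520124) < 1e-12
```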
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_concedo__Vicuzard-30B-Uncensored | 2023-09-23T02:47:48.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of concedo/Vicuzard-30B-Uncensored
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [concedo/Vicuzard-30B-Uncensored](https://huggingface.co/concedo/Vicuzard-30B-Uncensored)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_concedo__Vicuzard-30B-Uncensored\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T02:47:37.236097](https://huggingface.co/datasets/open-llm-leaderboard/details_concedo__Vicuzard-30B-Uncensored/blob/main/results_2023-09-23T02-47-37.236097.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.17365771812080538,\n\
\ \"em_stderr\": 0.003879418958892462,\n \"f1\": 0.2676352768456391,\n\
\ \"f1_stderr\": 0.003979938331768844,\n \"acc\": 0.46250866906059396,\n\
\ \"acc_stderr\": 0.010873579764037198\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.17365771812080538,\n \"em_stderr\": 0.003879418958892462,\n\
\ \"f1\": 0.2676352768456391,\n \"f1_stderr\": 0.003979938331768844\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15390447308567096,\n \
\ \"acc_stderr\": 0.009939799304049\n },\n \"harness|winogrande|5\": {\n\
\ \"acc\": 0.771112865035517,\n \"acc_stderr\": 0.011807360224025395\n\
\ }\n}\n```"
repo_url: https://huggingface.co/concedo/Vicuzard-30B-Uncensored
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|arc:challenge|25_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T02_47_37.236097
path:
- '**/details_harness|drop|3_2023-09-23T02-47-37.236097.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T02-47-37.236097.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T02_47_37.236097
path:
- '**/details_harness|gsm8k|5_2023-09-23T02-47-37.236097.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T02-47-37.236097.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hellaswag|10_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:20:40.681862.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T22:20:40.681862.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T22:20:40.681862.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T02_47_37.236097
path:
- '**/details_harness|winogrande|5_2023-09-23T02-47-37.236097.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T02-47-37.236097.parquet'
- config_name: results
data_files:
- split: 2023_07_19T22_20_40.681862
path:
- results_2023-07-19T22:20:40.681862.parquet
- split: 2023_09_23T02_47_37.236097
path:
- results_2023-09-23T02-47-37.236097.parquet
- split: latest
path:
- results_2023-09-23T02-47-37.236097.parquet
---
# Dataset Card for Evaluation run of concedo/Vicuzard-30B-Uncensored
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/concedo/Vicuzard-30B-Uncensored
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [concedo/Vicuzard-30B-Uncensored](https://huggingface.co/concedo/Vicuzard-30B-Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_concedo__Vicuzard-30B-Uncensored",
"harness_winogrande_5",
split="train")
```
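As the configuration listings above suggest, each run's split name is derived from its timestamp with the `-` and `:` separators replaced by underscores (e.g. run `2023-09-23T02:47:37.236097` is stored under the split `2023_09_23T02_47_37.236097`). A minimal sketch of that mapping, assuming the convention holds for all runs:

```python
def split_name(timestamp: str) -> str:
    # Run timestamps like "2023-09-23T02:47:37.236097" become split names
    # by replacing '-' and ':' with '_'; the fractional-second '.' is kept.
    return timestamp.replace("-", "_").replace(":", "_")

print(split_name("2023-09-23T02:47:37.236097"))  # -> 2023_09_23T02_47_37.236097
```

Passing such a split name (or simply `"latest"`) as `split=` selects the corresponding run.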
## Latest results
These are the [latest results from run 2023-09-23T02:47:37.236097](https://huggingface.co/datasets/open-llm-leaderboard/details_concedo__Vicuzard-30B-Uncensored/blob/main/results_2023-09-23T02-47-37.236097.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.17365771812080538,
"em_stderr": 0.003879418958892462,
"f1": 0.2676352768456391,
"f1_stderr": 0.003979938331768844,
"acc": 0.46250866906059396,
"acc_stderr": 0.010873579764037198
},
"harness|drop|3": {
"em": 0.17365771812080538,
"em_stderr": 0.003879418958892462,
"f1": 0.2676352768456391,
"f1_stderr": 0.003979938331768844
},
"harness|gsm8k|5": {
"acc": 0.15390447308567096,
"acc_stderr": 0.009939799304049
},
"harness|winogrande|5": {
"acc": 0.771112865035517,
"acc_stderr": 0.011807360224025395
}
}
```
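The `acc` in the "all" block appears to be the unweighted mean of the per-task accuracies (this is an inference from the numbers, not something the card states); a quick sanity check using the values copied from the JSON above:

```python
import math

# Per-task accuracies copied verbatim from the results JSON above.
task_acc = {
    "harness|gsm8k|5": 0.15390447308567096,
    "harness|winogrande|5": 0.771112865035517,
}

# The "all" -> "acc" value matches their unweighted mean.
mean_acc = sum(task_acc.values()) / len(task_acc)
print(mean_acc)  # ~0.46250866906059396
```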
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_yhyhy3__open_llama_7b_v2_med_instruct | 2023-08-27T12:37:45.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of yhyhy3/open_llama_7b_v2_med_instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yhyhy3/open_llama_7b_v2_med_instruct](https://huggingface.co/yhyhy3/open_llama_7b_v2_med_instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yhyhy3__open_llama_7b_v2_med_instruct\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-24T11:52:38.098362](https://huggingface.co/datasets/open-llm-leaderboard/details_yhyhy3__open_llama_7b_v2_med_instruct/blob/main/results_2023-07-24T11%3A52%3A38.098362.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4262190767441844,\n\
\ \"acc_stderr\": 0.035126469911175365,\n \"acc_norm\": 0.4298128076247261,\n\
\ \"acc_norm_stderr\": 0.035115194002025125,\n \"mc1\": 0.2594859241126071,\n\
\ \"mc1_stderr\": 0.01534540948555798,\n \"mc2\": 0.40332713135747483,\n\
\ \"mc2_stderr\": 0.014423706214634667\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.44368600682593856,\n \"acc_stderr\": 0.014518421825670444,\n\
\ \"acc_norm\": 0.46501706484641636,\n \"acc_norm_stderr\": 0.01457558392201967\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.578370842461661,\n\
\ \"acc_stderr\": 0.004928105880776078,\n \"acc_norm\": 0.7690699063931488,\n\
\ \"acc_norm_stderr\": 0.004205665144562954\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.0404633688397825,\n\
\ \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.0404633688397825\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.49056603773584906,\n \"acc_stderr\": 0.0307673947078081,\n\
\ \"acc_norm\": 0.49056603773584906,\n \"acc_norm_stderr\": 0.0307673947078081\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3958333333333333,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.3958333333333333,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\"\
: 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3699421965317919,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.3699421965317919,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149352,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149352\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.031778212502369216,\n\
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.031778212502369216\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436695,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436695\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29894179894179895,\n \"acc_stderr\": 0.023577604791655816,\n \"\
acc_norm\": 0.29894179894179895,\n \"acc_norm_stderr\": 0.023577604791655816\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.039325376803928704,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.039325376803928704\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4483870967741935,\n\
\ \"acc_stderr\": 0.02829205683011273,\n \"acc_norm\": 0.4483870967741935,\n\
\ \"acc_norm_stderr\": 0.02829205683011273\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4727272727272727,\n \"acc_stderr\": 0.03898531605579419,\n\
\ \"acc_norm\": 0.4727272727272727,\n \"acc_norm_stderr\": 0.03898531605579419\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.035623524993954825,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.035623524993954825\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\"\
: {\n \"acc\": 0.5751295336787565,\n \"acc_stderr\": 0.0356747133521254,\n\
\ \"acc_norm\": 0.5751295336787565,\n \"acc_norm_stderr\": 0.0356747133521254\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.39487179487179486,\n \"acc_stderr\": 0.02478431694215637,\n\
\ \"acc_norm\": 0.39487179487179486,\n \"acc_norm_stderr\": 0.02478431694215637\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.02659393910184407,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.02659393910184407\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3949579831932773,\n \"acc_stderr\": 0.03175367846096625,\n \
\ \"acc_norm\": 0.3949579831932773,\n \"acc_norm_stderr\": 0.03175367846096625\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5651376146788991,\n\
\ \"acc_stderr\": 0.021254631465609283,\n \"acc_norm\": 0.5651376146788991,\n\
\ \"acc_norm_stderr\": 0.021254631465609283\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.2916666666666667,\n \"acc_stderr\": 0.030998666304560534,\n\
\ \"acc_norm\": 0.2916666666666667,\n \"acc_norm_stderr\": 0.030998666304560534\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.44607843137254904,\n \"acc_stderr\": 0.03488845451304974,\n \"\
acc_norm\": 0.44607843137254904,\n \"acc_norm_stderr\": 0.03488845451304974\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5443037974683544,\n \"acc_stderr\": 0.03241920684693335,\n \
\ \"acc_norm\": 0.5443037974683544,\n \"acc_norm_stderr\": 0.03241920684693335\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.515695067264574,\n\
\ \"acc_stderr\": 0.0335412657542081,\n \"acc_norm\": 0.515695067264574,\n\
\ \"acc_norm_stderr\": 0.0335412657542081\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5114503816793893,\n \"acc_stderr\": 0.04384140024078016,\n\
\ \"acc_norm\": 0.5114503816793893,\n \"acc_norm_stderr\": 0.04384140024078016\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5619834710743802,\n \"acc_stderr\": 0.04529146804435792,\n \"\
acc_norm\": 0.5619834710743802,\n \"acc_norm_stderr\": 0.04529146804435792\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.49074074074074076,\n\
\ \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.49074074074074076,\n\
\ \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.49079754601226994,\n \"acc_stderr\": 0.03927705600787443,\n\
\ \"acc_norm\": 0.49079754601226994,\n \"acc_norm_stderr\": 0.03927705600787443\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6019417475728155,\n \"acc_stderr\": 0.04846748253977239,\n\
\ \"acc_norm\": 0.6019417475728155,\n \"acc_norm_stderr\": 0.04846748253977239\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6581196581196581,\n\
\ \"acc_stderr\": 0.031075028526507748,\n \"acc_norm\": 0.6581196581196581,\n\
\ \"acc_norm_stderr\": 0.031075028526507748\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6130268199233716,\n\
\ \"acc_stderr\": 0.017417138059440125,\n \"acc_norm\": 0.6130268199233716,\n\
\ \"acc_norm_stderr\": 0.017417138059440125\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.476878612716763,\n \"acc_stderr\": 0.026890297881303128,\n\
\ \"acc_norm\": 0.476878612716763,\n \"acc_norm_stderr\": 0.026890297881303128\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27039106145251396,\n\
\ \"acc_stderr\": 0.014854993938010078,\n \"acc_norm\": 0.27039106145251396,\n\
\ \"acc_norm_stderr\": 0.014854993938010078\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.43790849673202614,\n \"acc_stderr\": 0.02840830202033269,\n\
\ \"acc_norm\": 0.43790849673202614,\n \"acc_norm_stderr\": 0.02840830202033269\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4630225080385852,\n\
\ \"acc_stderr\": 0.02832032583010591,\n \"acc_norm\": 0.4630225080385852,\n\
\ \"acc_norm_stderr\": 0.02832032583010591\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4104938271604938,\n \"acc_stderr\": 0.027371350925124768,\n\
\ \"acc_norm\": 0.4104938271604938,\n \"acc_norm_stderr\": 0.027371350925124768\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.32269503546099293,\n \"acc_stderr\": 0.02788913930053479,\n \
\ \"acc_norm\": 0.32269503546099293,\n \"acc_norm_stderr\": 0.02788913930053479\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.32333767926988266,\n\
\ \"acc_stderr\": 0.011946565758447204,\n \"acc_norm\": 0.32333767926988266,\n\
\ \"acc_norm_stderr\": 0.011946565758447204\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4007352941176471,\n \"acc_stderr\": 0.029768263528933112,\n\
\ \"acc_norm\": 0.4007352941176471,\n \"acc_norm_stderr\": 0.029768263528933112\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.40032679738562094,\n \"acc_stderr\": 0.01982184368827178,\n \
\ \"acc_norm\": 0.40032679738562094,\n \"acc_norm_stderr\": 0.01982184368827178\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04789131426105757,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04789131426105757\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.37142857142857144,\n \"acc_stderr\": 0.03093285879278986,\n\
\ \"acc_norm\": 0.37142857142857144,\n \"acc_norm_stderr\": 0.03093285879278986\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5522388059701493,\n\
\ \"acc_stderr\": 0.035161847729521675,\n \"acc_norm\": 0.5522388059701493,\n\
\ \"acc_norm_stderr\": 0.035161847729521675\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n\
\ \"acc_stderr\": 0.038194861407583984,\n \"acc_norm\": 0.4036144578313253,\n\
\ \"acc_norm_stderr\": 0.038194861407583984\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5614035087719298,\n \"acc_stderr\": 0.038057975055904594,\n\
\ \"acc_norm\": 0.5614035087719298,\n \"acc_norm_stderr\": 0.038057975055904594\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2594859241126071,\n\
\ \"mc1_stderr\": 0.01534540948555798,\n \"mc2\": 0.40332713135747483,\n\
\ \"mc2_stderr\": 0.014423706214634667\n }\n}\n```"
repo_url: https://huggingface.co/yhyhy3/open_llama_7b_v2_med_instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|arc:challenge|25_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hellaswag|10_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T11:52:38.098362.parquet'
- config_name: results
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- results_2023-07-24T11:52:38.098362.parquet
- split: latest
path:
- results_2023-07-24T11:52:38.098362.parquet
---
# Dataset Card for Evaluation run of yhyhy3/open_llama_7b_v2_med_instruct
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yhyhy3/open_llama_7b_v2_med_instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yhyhy3/open_llama_7b_v2_med_instruct](https://huggingface.co/yhyhy3/open_llama_7b_v2_med_instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yhyhy3__open_llama_7b_v2_med_instruct",
"harness_truthfulqa_mc_0",
    split="latest")
```
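The config names above follow a simple pattern derived from the harness task labels: the `|`, `:`, and `-` separators become underscores. As a sketch, a hypothetical helper (not part of the `datasets` API) that derives the config name for any task shown in this card could look like:

```python
# Hypothetical helper: map a harness task label to the config name used in
# this repo, based on the pattern visible in the YAML above
# ("|", ":" and "-" all become "_").
def task_to_config(task: str) -> str:
    return task.replace("|", "_").replace(":", "_").replace("-", "_")

print(task_to_config("harness|truthfulqa:mc|0"))         # harness_truthfulqa_mc_0
print(task_to_config("harness|hendrycksTest-anatomy|5"))  # harness_hendrycksTest_anatomy_5
```

The same substitution applies to the timestamp split names (e.g. `2023-07-24T11:52:38.098362` becomes split `2023_07_24T11_52_38.098362`, with the dots kept).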
## Latest results
These are the [latest results from run 2023-07-24T11:52:38.098362](https://huggingface.co/datasets/open-llm-leaderboard/details_yhyhy3__open_llama_7b_v2_med_instruct/blob/main/results_2023-07-24T11%3A52%3A38.098362.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.4262190767441844,
"acc_stderr": 0.035126469911175365,
"acc_norm": 0.4298128076247261,
"acc_norm_stderr": 0.035115194002025125,
"mc1": 0.2594859241126071,
"mc1_stderr": 0.01534540948555798,
"mc2": 0.40332713135747483,
"mc2_stderr": 0.014423706214634667
},
"harness|arc:challenge|25": {
"acc": 0.44368600682593856,
"acc_stderr": 0.014518421825670444,
"acc_norm": 0.46501706484641636,
"acc_norm_stderr": 0.01457558392201967
},
"harness|hellaswag|10": {
"acc": 0.578370842461661,
"acc_stderr": 0.004928105880776078,
"acc_norm": 0.7690699063931488,
"acc_norm_stderr": 0.004205665144562954
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.0404633688397825,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.0404633688397825
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.49056603773584906,
"acc_stderr": 0.0307673947078081,
"acc_norm": 0.49056603773584906,
"acc_norm_stderr": 0.0307673947078081
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3958333333333333,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.3958333333333333,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3699421965317919,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.3699421965317919,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.03873958714149352,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.03873958714149352
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.031778212502369216,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.031778212502369216
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436695,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436695
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29894179894179895,
"acc_stderr": 0.023577604791655816,
"acc_norm": 0.29894179894179895,
"acc_norm_stderr": 0.023577604791655816
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.039325376803928704,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.039325376803928704
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4483870967741935,
"acc_stderr": 0.02829205683011273,
"acc_norm": 0.4483870967741935,
"acc_norm_stderr": 0.02829205683011273
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4727272727272727,
"acc_stderr": 0.03898531605579419,
"acc_norm": 0.4727272727272727,
"acc_norm_stderr": 0.03898531605579419
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5,
"acc_stderr": 0.035623524993954825,
"acc_norm": 0.5,
"acc_norm_stderr": 0.035623524993954825
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5751295336787565,
"acc_stderr": 0.0356747133521254,
"acc_norm": 0.5751295336787565,
"acc_norm_stderr": 0.0356747133521254
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.39487179487179486,
"acc_stderr": 0.02478431694215637,
"acc_norm": 0.39487179487179486,
"acc_norm_stderr": 0.02478431694215637
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.02659393910184407,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.02659393910184407
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3949579831932773,
"acc_stderr": 0.03175367846096625,
"acc_norm": 0.3949579831932773,
"acc_norm_stderr": 0.03175367846096625
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5651376146788991,
"acc_stderr": 0.021254631465609283,
"acc_norm": 0.5651376146788991,
"acc_norm_stderr": 0.021254631465609283
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.030998666304560534,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.030998666304560534
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.44607843137254904,
"acc_stderr": 0.03488845451304974,
"acc_norm": 0.44607843137254904,
"acc_norm_stderr": 0.03488845451304974
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5443037974683544,
"acc_stderr": 0.03241920684693335,
"acc_norm": 0.5443037974683544,
"acc_norm_stderr": 0.03241920684693335
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.515695067264574,
"acc_stderr": 0.0335412657542081,
"acc_norm": 0.515695067264574,
"acc_norm_stderr": 0.0335412657542081
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5114503816793893,
"acc_stderr": 0.04384140024078016,
"acc_norm": 0.5114503816793893,
"acc_norm_stderr": 0.04384140024078016
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5619834710743802,
"acc_stderr": 0.04529146804435792,
"acc_norm": 0.5619834710743802,
"acc_norm_stderr": 0.04529146804435792
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49079754601226994,
"acc_stderr": 0.03927705600787443,
"acc_norm": 0.49079754601226994,
"acc_norm_stderr": 0.03927705600787443
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.6019417475728155,
"acc_stderr": 0.04846748253977239,
"acc_norm": 0.6019417475728155,
"acc_norm_stderr": 0.04846748253977239
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6581196581196581,
"acc_stderr": 0.031075028526507748,
"acc_norm": 0.6581196581196581,
"acc_norm_stderr": 0.031075028526507748
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6130268199233716,
"acc_stderr": 0.017417138059440125,
"acc_norm": 0.6130268199233716,
"acc_norm_stderr": 0.017417138059440125
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.476878612716763,
"acc_stderr": 0.026890297881303128,
"acc_norm": 0.476878612716763,
"acc_norm_stderr": 0.026890297881303128
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27039106145251396,
"acc_stderr": 0.014854993938010078,
"acc_norm": 0.27039106145251396,
"acc_norm_stderr": 0.014854993938010078
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.43790849673202614,
"acc_stderr": 0.02840830202033269,
"acc_norm": 0.43790849673202614,
"acc_norm_stderr": 0.02840830202033269
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4630225080385852,
"acc_stderr": 0.02832032583010591,
"acc_norm": 0.4630225080385852,
"acc_norm_stderr": 0.02832032583010591
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4104938271604938,
"acc_stderr": 0.027371350925124768,
"acc_norm": 0.4104938271604938,
"acc_norm_stderr": 0.027371350925124768
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.32269503546099293,
"acc_stderr": 0.02788913930053479,
"acc_norm": 0.32269503546099293,
"acc_norm_stderr": 0.02788913930053479
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.32333767926988266,
"acc_stderr": 0.011946565758447204,
"acc_norm": 0.32333767926988266,
"acc_norm_stderr": 0.011946565758447204
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4007352941176471,
"acc_stderr": 0.029768263528933112,
"acc_norm": 0.4007352941176471,
"acc_norm_stderr": 0.029768263528933112
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.40032679738562094,
"acc_stderr": 0.01982184368827178,
"acc_norm": 0.40032679738562094,
"acc_norm_stderr": 0.01982184368827178
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5,
"acc_stderr": 0.04789131426105757,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04789131426105757
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.37142857142857144,
"acc_stderr": 0.03093285879278986,
"acc_norm": 0.37142857142857144,
"acc_norm_stderr": 0.03093285879278986
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5522388059701493,
"acc_stderr": 0.035161847729521675,
"acc_norm": 0.5522388059701493,
"acc_norm_stderr": 0.035161847729521675
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.038194861407583984,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.038194861407583984
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5614035087719298,
"acc_stderr": 0.038057975055904594,
"acc_norm": 0.5614035087719298,
"acc_norm_stderr": 0.038057975055904594
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2594859241126071,
"mc1_stderr": 0.01534540948555798,
"mc2": 0.40332713135747483,
"mc2_stderr": 0.014423706214634667
}
}
```
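The `"all"` entry above aggregates the per-task scores. As a minimal sketch (using only a small excerpt of the dict above, not the full 61-task results), the MMLU average can be recomputed by filtering the `hendrycksTest` keys:

```python
# Sketch: average the per-task MMLU ("hendrycksTest") accuracies from a
# results dict shaped like the one above. Only two MMLU tasks are included
# here for illustration; the real dict has 57.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.28},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.48148148148148145},
    "harness|truthfulqa:mc|0": {"mc1": 0.2594859241126071},
}

mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU average over {len(mmlu_accs)} tasks: {mmlu_avg:.4f}")
```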
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_yhyhy3__med-orca-instruct-33b | 2023-09-26T02:39:35.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of yhyhy3/med-orca-instruct-33b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yhyhy3/med-orca-instruct-33b](https://huggingface.co/yhyhy3/med-orca-instruct-33b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yhyhy3__med-orca-instruct-33b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-26T02:39:23.109820](https://huggingface.co/datasets/open-llm-leaderboard/details_yhyhy3__med-orca-instruct-33b/blob/main/results_2023-09-26T02-39-23.109820.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"\
em_stderr\": 0.0,\n \"f1\": 3.565436241610739e-05,\n \"f1_stderr\"\
: 1.0790982405422018e-05,\n \"acc\": 0.23954222573007103,\n \"acc_stderr\"\
: 0.007020092747106471\n },\n \"harness|drop|3\": {\n \"em\": 0.0,\n\
\ \"em_stderr\": 0.0,\n \"f1\": 3.565436241610739e-05,\n \"\
f1_stderr\": 1.0790982405422018e-05\n },\n \"harness|gsm8k|5\": {\n \
\ \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.47908445146014206,\n \"acc_stderr\": 0.014040185494212942\n\
\ }\n}\n```"
repo_url: https://huggingface.co/yhyhy3/med-orca-instruct-33b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|arc:challenge|25_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|arc:challenge|25_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_26T02_39_23.109820
path:
- '**/details_harness|drop|3_2023-09-26T02-39-23.109820.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-26T02-39-23.109820.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_26T02_39_23.109820
path:
- '**/details_harness|gsm8k|5_2023-09-26T02-39-23.109820.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-26T02-39-23.109820.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hellaswag|10_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hellaswag|10_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_26T02_39_23.109820
path:
- '**/details_harness|winogrande|5_2023-09-26T02-39-23.109820.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-26T02-39-23.109820.parquet'
- config_name: results
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- results_2023-08-09T13:49:32.359108.parquet
- split: 2023_08_18T09_03_49.045450
path:
- results_2023-08-18T09:03:49.045450.parquet
- split: 2023_09_26T02_39_23.109820
path:
- results_2023-09-26T02-39-23.109820.parquet
- split: latest
path:
- results_2023-09-26T02-39-23.109820.parquet
---
# Dataset Card for Evaluation run of yhyhy3/med-orca-instruct-33b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yhyhy3/med-orca-instruct-33b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yhyhy3/med-orca-instruct-33b](https://huggingface.co/yhyhy3/med-orca-instruct-33b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yhyhy3__med-orca-instruct-33b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-26T02:39:23.109820](https://huggingface.co/datasets/open-llm-leaderboard/details_yhyhy3__med-orca-instruct-33b/blob/main/results_2023-09-26T02-39-23.109820.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 3.565436241610739e-05,
"f1_stderr": 1.0790982405422018e-05,
"acc": 0.23954222573007103,
"acc_stderr": 0.007020092747106471
},
"harness|drop|3": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 3.565436241610739e-05,
"f1_stderr": 1.0790982405422018e-05
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.47908445146014206,
"acc_stderr": 0.014040185494212942
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Ejafa__vicuna_7B_vanilla_1.1 | 2023-08-27T12:37:48.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Ejafa/vicuna_7B_vanilla_1.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Ejafa/vicuna_7B_vanilla_1.1](https://huggingface.co/Ejafa/vicuna_7B_vanilla_1.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Ejafa__vicuna_7B_vanilla_1.1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-19T16:40:36.774019](https://huggingface.co/datasets/open-llm-leaderboard/details_Ejafa__vicuna_7B_vanilla_1.1/blob/main/results_2023-07-19T16%3A40%3A36.774019.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4591867762728675,\n\
\ \"acc_stderr\": 0.035239355625612076,\n \"acc_norm\": 0.46304684097226445,\n\
\ \"acc_norm_stderr\": 0.03522600205076519,\n \"mc1\": 0.33047735618115054,\n\
\ \"mc1_stderr\": 0.0164667696136983,\n \"mc2\": 0.48940747456304606,\n\
\ \"mc2_stderr\": 0.015298126884049629\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4991467576791809,\n \"acc_stderr\": 0.014611369529813279,\n\
\ \"acc_norm\": 0.5366894197952219,\n \"acc_norm_stderr\": 0.014572000527756989\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5844453296156145,\n\
\ \"acc_stderr\": 0.004918102168717934,\n \"acc_norm\": 0.7746464847639912,\n\
\ \"acc_norm_stderr\": 0.00416961025480796\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.4,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4934210526315789,\n \"acc_stderr\": 0.040685900502249704,\n\
\ \"acc_norm\": 0.4934210526315789,\n \"acc_norm_stderr\": 0.040685900502249704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5132075471698113,\n \"acc_stderr\": 0.030762134874500482,\n\
\ \"acc_norm\": 0.5132075471698113,\n \"acc_norm_stderr\": 0.030762134874500482\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04155319955593146,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04155319955593146\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.35260115606936415,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.35260115606936415,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087785,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.03177821250236922,\n\
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.03177821250236922\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748142,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748142\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.28835978835978837,\n \"acc_stderr\": 0.023330654054535886,\n \"\
acc_norm\": 0.28835978835978837,\n \"acc_norm_stderr\": 0.023330654054535886\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.4612903225806452,\n \"acc_stderr\": 0.028358634859836928,\n \"\
acc_norm\": 0.4612903225806452,\n \"acc_norm_stderr\": 0.028358634859836928\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3251231527093596,\n \"acc_stderr\": 0.032957975663112704,\n \"\
acc_norm\": 0.3251231527093596,\n \"acc_norm_stderr\": 0.032957975663112704\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5878787878787879,\n \"acc_stderr\": 0.038435669935887186,\n\
\ \"acc_norm\": 0.5878787878787879,\n \"acc_norm_stderr\": 0.038435669935887186\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5959595959595959,\n \"acc_stderr\": 0.03496130972056128,\n \"\
acc_norm\": 0.5959595959595959,\n \"acc_norm_stderr\": 0.03496130972056128\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6683937823834197,\n \"acc_stderr\": 0.03397636541089118,\n\
\ \"acc_norm\": 0.6683937823834197,\n \"acc_norm_stderr\": 0.03397636541089118\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.024838811988033165,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.024838811988033165\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.22962962962962963,\n \"acc_stderr\": 0.025644108639267634,\n\
\ \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.025644108639267634\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3865546218487395,\n \"acc_stderr\": 0.03163145807552378,\n \
\ \"acc_norm\": 0.3865546218487395,\n \"acc_norm_stderr\": 0.03163145807552378\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969654,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969654\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.021004201260420075,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.021004201260420075\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.03324708911809117,\n\
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.03324708911809117\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6029411764705882,\n \"acc_stderr\": 0.0343413116471913,\n \"acc_norm\"\
: 0.6029411764705882,\n \"acc_norm_stderr\": 0.0343413116471913\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.5780590717299579,\n \"acc_stderr\": 0.032148146302403695,\n \"\
acc_norm\": 0.5780590717299579,\n \"acc_norm_stderr\": 0.032148146302403695\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5381165919282511,\n\
\ \"acc_stderr\": 0.033460150119732274,\n \"acc_norm\": 0.5381165919282511,\n\
\ \"acc_norm_stderr\": 0.033460150119732274\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5702479338842975,\n \"acc_stderr\": 0.04519082021319772,\n \"\
acc_norm\": 0.5702479338842975,\n \"acc_norm_stderr\": 0.04519082021319772\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04803752235190192,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04803752235190192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.50920245398773,\n \"acc_stderr\": 0.03927705600787443,\n\
\ \"acc_norm\": 0.50920245398773,\n \"acc_norm_stderr\": 0.03927705600787443\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5728155339805825,\n \"acc_stderr\": 0.04897957737781168,\n\
\ \"acc_norm\": 0.5728155339805825,\n \"acc_norm_stderr\": 0.04897957737781168\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6837606837606838,\n\
\ \"acc_stderr\": 0.030463656747340265,\n \"acc_norm\": 0.6837606837606838,\n\
\ \"acc_norm_stderr\": 0.030463656747340265\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.01726860756000578,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.01726860756000578\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5057803468208093,\n \"acc_stderr\": 0.026917296179149123,\n\
\ \"acc_norm\": 0.5057803468208093,\n \"acc_norm_stderr\": 0.026917296179149123\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2446927374301676,\n\
\ \"acc_stderr\": 0.014378169884098409,\n \"acc_norm\": 0.2446927374301676,\n\
\ \"acc_norm_stderr\": 0.014378169884098409\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4869281045751634,\n \"acc_stderr\": 0.028620130800700246,\n\
\ \"acc_norm\": 0.4869281045751634,\n \"acc_norm_stderr\": 0.028620130800700246\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.48231511254019294,\n\
\ \"acc_stderr\": 0.02838032284907713,\n \"acc_norm\": 0.48231511254019294,\n\
\ \"acc_norm_stderr\": 0.02838032284907713\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.027801656212323667,\n\
\ \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.027801656212323667\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3404255319148936,\n \"acc_stderr\": 0.028267657482650147,\n \
\ \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.028267657482650147\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34876140808344197,\n\
\ \"acc_stderr\": 0.01217203515712712,\n \"acc_norm\": 0.34876140808344197,\n\
\ \"acc_norm_stderr\": 0.01217203515712712\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5036764705882353,\n \"acc_stderr\": 0.030372015885428195,\n\
\ \"acc_norm\": 0.5036764705882353,\n \"acc_norm_stderr\": 0.030372015885428195\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4395424836601307,\n \"acc_stderr\": 0.020079420408087918,\n \
\ \"acc_norm\": 0.4395424836601307,\n \"acc_norm_stderr\": 0.020079420408087918\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n\
\ \"acc_stderr\": 0.04785964010794916,\n \"acc_norm\": 0.5181818181818182,\n\
\ \"acc_norm_stderr\": 0.04785964010794916\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5387755102040817,\n \"acc_stderr\": 0.031912820526692774,\n\
\ \"acc_norm\": 0.5387755102040817,\n \"acc_norm_stderr\": 0.031912820526692774\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6318407960199005,\n\
\ \"acc_stderr\": 0.03410410565495302,\n \"acc_norm\": 0.6318407960199005,\n\
\ \"acc_norm_stderr\": 0.03410410565495302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.036155076303109365,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.036155076303109365\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33047735618115054,\n\
\ \"mc1_stderr\": 0.0164667696136983,\n \"mc2\": 0.48940747456304606,\n\
\ \"mc2_stderr\": 0.015298126884049629\n }\n}\n```"
repo_url: https://huggingface.co/Ejafa/vicuna_7B_vanilla_1.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:40:36.774019.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:40:36.774019.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:40:36.774019.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:40:36.774019.parquet'
- config_name: results
data_files:
- split: 2023_07_19T16_40_36.774019
path:
- results_2023-07-19T16:40:36.774019.parquet
- split: latest
path:
- results_2023-07-19T16:40:36.774019.parquet
---
# Dataset Card for Evaluation run of Ejafa/vicuna_7B_vanilla_1.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Ejafa/vicuna_7B_vanilla_1.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Ejafa/vicuna_7B_vanilla_1.1](https://huggingface.co/Ejafa/vicuna_7B_vanilla_1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Ejafa__vicuna_7B_vanilla_1.1",
"harness_truthfulqa_mc_0",
	split="latest")
```
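
The per-task config names follow a mechanical pattern: the harness task identifier with `:` and `-` replaced by `_`, prefixed with `harness_` and suffixed with the few-shot count (e.g. `harness|truthfulqa:mc|0` becomes `harness_truthfulqa_mc_0`). A minimal helper sketching that mapping, assuming only the naming pattern visible in the configs above (the function name is illustrative):

```python
def harness_config_name(task: str, num_fewshot: int) -> str:
    """Build this dataset's config name for a harness task.

    Assumes the pattern visible in the configs above: ':' and '-'
    in the task name become '_', with a 'harness_' prefix and the
    few-shot count appended as a suffix.
    """
    normalized = task.replace(":", "_").replace("-", "_")
    return f"harness_{normalized}_{num_fewshot}"


# Examples matching the configs listed in this card:
print(harness_config_name("arc:challenge", 25))          # harness_arc_challenge_25
print(harness_config_name("hendrycksTest-virology", 5))  # harness_hendrycksTest_virology_5
print(harness_config_name("truthfulqa:mc", 0))           # harness_truthfulqa_mc_0
```

Any of these names can be passed as the second argument to `load_dataset` in place of `"harness_truthfulqa_mc_0"` above.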
## Latest results
These are the [latest results from run 2023-07-19T16:40:36.774019](https://huggingface.co/datasets/open-llm-leaderboard/details_Ejafa__vicuna_7B_vanilla_1.1/blob/main/results_2023-07-19T16%3A40%3A36.774019.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4591867762728675,
"acc_stderr": 0.035239355625612076,
"acc_norm": 0.46304684097226445,
"acc_norm_stderr": 0.03522600205076519,
"mc1": 0.33047735618115054,
"mc1_stderr": 0.0164667696136983,
"mc2": 0.48940747456304606,
"mc2_stderr": 0.015298126884049629
},
"harness|arc:challenge|25": {
"acc": 0.4991467576791809,
"acc_stderr": 0.014611369529813279,
"acc_norm": 0.5366894197952219,
"acc_norm_stderr": 0.014572000527756989
},
"harness|hellaswag|10": {
"acc": 0.5844453296156145,
"acc_stderr": 0.004918102168717934,
"acc_norm": 0.7746464847639912,
"acc_norm_stderr": 0.00416961025480796
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4934210526315789,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.4934210526315789,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5132075471698113,
"acc_stderr": 0.030762134874500482,
"acc_norm": 0.5132075471698113,
"acc_norm_stderr": 0.030762134874500482
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04155319955593146,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04155319955593146
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.35260115606936415,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.35260115606936415,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.043898699568087785,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.043898699568087785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748142,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748142
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.28835978835978837,
"acc_stderr": 0.023330654054535886,
"acc_norm": 0.28835978835978837,
"acc_norm_stderr": 0.023330654054535886
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4612903225806452,
"acc_stderr": 0.028358634859836928,
"acc_norm": 0.4612903225806452,
"acc_norm_stderr": 0.028358634859836928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3251231527093596,
"acc_stderr": 0.032957975663112704,
"acc_norm": 0.3251231527093596,
"acc_norm_stderr": 0.032957975663112704
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5878787878787879,
"acc_stderr": 0.038435669935887186,
"acc_norm": 0.5878787878787879,
"acc_norm_stderr": 0.038435669935887186
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5959595959595959,
"acc_stderr": 0.03496130972056128,
"acc_norm": 0.5959595959595959,
"acc_norm_stderr": 0.03496130972056128
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6683937823834197,
"acc_stderr": 0.03397636541089118,
"acc_norm": 0.6683937823834197,
"acc_norm_stderr": 0.03397636541089118
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4,
"acc_stderr": 0.024838811988033165,
"acc_norm": 0.4,
"acc_norm_stderr": 0.024838811988033165
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.025644108639267634,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.025644108639267634
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3865546218487395,
"acc_stderr": 0.03163145807552378,
"acc_norm": 0.3865546218487395,
"acc_norm_stderr": 0.03163145807552378
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969654,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6,
"acc_stderr": 0.021004201260420075,
"acc_norm": 0.6,
"acc_norm_stderr": 0.021004201260420075
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.03324708911809117,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.03324708911809117
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.0343413116471913,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.0343413116471913
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5780590717299579,
"acc_stderr": 0.032148146302403695,
"acc_norm": 0.5780590717299579,
"acc_norm_stderr": 0.032148146302403695
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5381165919282511,
"acc_stderr": 0.033460150119732274,
"acc_norm": 0.5381165919282511,
"acc_norm_stderr": 0.033460150119732274
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5702479338842975,
"acc_stderr": 0.04519082021319772,
"acc_norm": 0.5702479338842975,
"acc_norm_stderr": 0.04519082021319772
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04803752235190192,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04803752235190192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.50920245398773,
"acc_stderr": 0.03927705600787443,
"acc_norm": 0.50920245398773,
"acc_norm_stderr": 0.03927705600787443
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.5728155339805825,
"acc_stderr": 0.04897957737781168,
"acc_norm": 0.5728155339805825,
"acc_norm_stderr": 0.04897957737781168
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6837606837606838,
"acc_stderr": 0.030463656747340265,
"acc_norm": 0.6837606837606838,
"acc_norm_stderr": 0.030463656747340265
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.01726860756000578,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.01726860756000578
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5057803468208093,
"acc_stderr": 0.026917296179149123,
"acc_norm": 0.5057803468208093,
"acc_norm_stderr": 0.026917296179149123
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2446927374301676,
"acc_stderr": 0.014378169884098409,
"acc_norm": 0.2446927374301676,
"acc_norm_stderr": 0.014378169884098409
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4869281045751634,
"acc_stderr": 0.028620130800700246,
"acc_norm": 0.4869281045751634,
"acc_norm_stderr": 0.028620130800700246
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.48231511254019294,
"acc_stderr": 0.02838032284907713,
"acc_norm": 0.48231511254019294,
"acc_norm_stderr": 0.02838032284907713
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.027801656212323667,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.027801656212323667
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.028267657482650147,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.028267657482650147
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34876140808344197,
"acc_stderr": 0.01217203515712712,
"acc_norm": 0.34876140808344197,
"acc_norm_stderr": 0.01217203515712712
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5036764705882353,
"acc_stderr": 0.030372015885428195,
"acc_norm": 0.5036764705882353,
"acc_norm_stderr": 0.030372015885428195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4395424836601307,
"acc_stderr": 0.020079420408087918,
"acc_norm": 0.4395424836601307,
"acc_norm_stderr": 0.020079420408087918
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794916,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794916
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5387755102040817,
"acc_stderr": 0.031912820526692774,
"acc_norm": 0.5387755102040817,
"acc_norm_stderr": 0.031912820526692774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6318407960199005,
"acc_stderr": 0.03410410565495302,
"acc_norm": 0.6318407960199005,
"acc_norm_stderr": 0.03410410565495302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.036155076303109365,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.036155076303109365
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33047735618115054,
"mc1_stderr": 0.0164667696136983,
"mc2": 0.48940747456304606,
"mc2_stderr": 0.015298126884049629
}
}
```
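The per-subject `harness|hendrycksTest-*` entries above are what the leaderboard rolls up into a single MMLU score. A minimal sketch of that kind of aggregation (the helper name is hypothetical, and it assumes `results` is the parsed dictionary shown above; the exact aggregation behind the `"all"` entry may differ):

```python
def mean_hendrycks_acc(results: dict) -> float:
    """Average `acc` over the per-subject hendrycksTest entries."""
    accs = [
        scores["acc"]
        for task, scores in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)
```

Entries such as `"all"` or `harness|truthfulqa:mc|0` are excluded by the prefix filter, so only the 57 MMLU subjects contribute to the mean.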
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_h2oai__h2ogpt-oasst1-512-12b | 2023-08-27T12:37:50.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of h2oai/h2ogpt-oasst1-512-12b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [h2oai/h2ogpt-oasst1-512-12b](https://huggingface.co/h2oai/h2ogpt-oasst1-512-12b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h2oai__h2ogpt-oasst1-512-12b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-19T18:11:10.994515](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-oasst1-512-12b/blob/main/results_2023-07-19T18%3A11%3A10.994515.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2670048335411601,\n\
\ \"acc_stderr\": 0.03183146115681845,\n \"acc_norm\": 0.27040012961165716,\n\
\ \"acc_norm_stderr\": 0.03182609816055044,\n \"mc1\": 0.22643818849449204,\n\
\ \"mc1_stderr\": 0.014651337324602587,\n \"mc2\": 0.364114682052683,\n\
\ \"mc2_stderr\": 0.013501776376430328\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.40273037542662116,\n \"acc_stderr\": 0.014332236306790147,\n\
\ \"acc_norm\": 0.4232081911262799,\n \"acc_norm_stderr\": 0.014438036220848029\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5226050587532364,\n\
\ \"acc_stderr\": 0.004984679359375621,\n \"acc_norm\": 0.7024497112129058,\n\
\ \"acc_norm_stderr\": 0.004562462665505218\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34074074074074073,\n\
\ \"acc_stderr\": 0.04094376269996793,\n \"acc_norm\": 0.34074074074074073,\n\
\ \"acc_norm_stderr\": 0.04094376269996793\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.037827289808654685,\n\
\ \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.037827289808654685\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3055555555555556,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.3055555555555556,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23121387283236994,\n\
\ \"acc_stderr\": 0.03214737302029471,\n \"acc_norm\": 0.23121387283236994,\n\
\ \"acc_norm_stderr\": 0.03214737302029471\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.040233822736177476,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.040233822736177476\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2425531914893617,\n \"acc_stderr\": 0.028020226271200217,\n\
\ \"acc_norm\": 0.2425531914893617,\n \"acc_norm_stderr\": 0.028020226271200217\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512321984,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512321984\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2698412698412698,\n \"acc_stderr\": 0.02286083830923207,\n \"\
acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.02286083830923207\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.16666666666666666,\n\
\ \"acc_stderr\": 0.03333333333333337,\n \"acc_norm\": 0.16666666666666666,\n\
\ \"acc_norm_stderr\": 0.03333333333333337\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.24516129032258063,\n \"acc_stderr\": 0.02447224384089553,\n \"\
acc_norm\": 0.24516129032258063,\n \"acc_norm_stderr\": 0.02447224384089553\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233486,\n \"\
acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233486\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.033175059300091805,\n\
\ \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.033175059300091805\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.20707070707070707,\n \"acc_stderr\": 0.028869778460267052,\n \"\
acc_norm\": 0.20707070707070707,\n \"acc_norm_stderr\": 0.028869778460267052\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22279792746113988,\n \"acc_stderr\": 0.03003114797764154,\n\
\ \"acc_norm\": 0.22279792746113988,\n \"acc_norm_stderr\": 0.03003114797764154\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2564102564102564,\n \"acc_stderr\": 0.022139081103971527,\n\
\ \"acc_norm\": 0.2564102564102564,\n \"acc_norm_stderr\": 0.022139081103971527\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2605042016806723,\n \"acc_stderr\": 0.028510251512341926,\n\
\ \"acc_norm\": 0.2605042016806723,\n \"acc_norm_stderr\": 0.028510251512341926\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22018348623853212,\n \"acc_stderr\": 0.017765978652327562,\n \"\
acc_norm\": 0.22018348623853212,\n \"acc_norm_stderr\": 0.017765978652327562\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2175925925925926,\n \"acc_stderr\": 0.028139689444859686,\n \"\
acc_norm\": 0.2175925925925926,\n \"acc_norm_stderr\": 0.028139689444859686\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.03019028245350195,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.03019028245350195\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3811659192825112,\n\
\ \"acc_stderr\": 0.03259625118416828,\n \"acc_norm\": 0.3811659192825112,\n\
\ \"acc_norm_stderr\": 0.03259625118416828\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.0372767357559692,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.0372767357559692\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4049586776859504,\n \"acc_stderr\": 0.04481137755942469,\n \"\
acc_norm\": 0.4049586776859504,\n \"acc_norm_stderr\": 0.04481137755942469\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.03259177392742177,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.03259177392742177\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.04157751539865629,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.04157751539865629\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.04245022486384493,\n\
\ \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.04245022486384493\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n\
\ \"acc_stderr\": 0.028605953702004243,\n \"acc_norm\": 0.2564102564102564,\n\
\ \"acc_norm_stderr\": 0.028605953702004243\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.3116219667943806,\n\
\ \"acc_stderr\": 0.016562433867284176,\n \"acc_norm\": 0.3116219667943806,\n\
\ \"acc_norm_stderr\": 0.016562433867284176\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.29190751445086704,\n \"acc_stderr\": 0.024476994076247326,\n\
\ \"acc_norm\": 0.29190751445086704,\n \"acc_norm_stderr\": 0.024476994076247326\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2908496732026144,\n \"acc_stderr\": 0.02600480036395211,\n\
\ \"acc_norm\": 0.2908496732026144,\n \"acc_norm_stderr\": 0.02600480036395211\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.2990353697749196,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.025171041915309684,\n\
\ \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.025171041915309684\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902013,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902013\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24641460234680573,\n\
\ \"acc_stderr\": 0.01100597139992724,\n \"acc_norm\": 0.24641460234680573,\n\
\ \"acc_norm_stderr\": 0.01100597139992724\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.19852941176470587,\n \"acc_stderr\": 0.024231013370541107,\n\
\ \"acc_norm\": 0.19852941176470587,\n \"acc_norm_stderr\": 0.024231013370541107\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.28594771241830064,\n \"acc_stderr\": 0.018280485072954676,\n \
\ \"acc_norm\": 0.28594771241830064,\n \"acc_norm_stderr\": 0.018280485072954676\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\
\ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\
\ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.1836734693877551,\n \"acc_stderr\": 0.024789071332007636,\n\
\ \"acc_norm\": 0.1836734693877551,\n \"acc_norm_stderr\": 0.024789071332007636\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.263681592039801,\n\
\ \"acc_stderr\": 0.031157150869355554,\n \"acc_norm\": 0.263681592039801,\n\
\ \"acc_norm_stderr\": 0.031157150869355554\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3132530120481928,\n\
\ \"acc_stderr\": 0.036108050180310235,\n \"acc_norm\": 0.3132530120481928,\n\
\ \"acc_norm_stderr\": 0.036108050180310235\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.32748538011695905,\n \"acc_stderr\": 0.035993357714560276,\n\
\ \"acc_norm\": 0.32748538011695905,\n \"acc_norm_stderr\": 0.035993357714560276\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22643818849449204,\n\
\ \"mc1_stderr\": 0.014651337324602587,\n \"mc2\": 0.364114682052683,\n\
\ \"mc2_stderr\": 0.013501776376430328\n }\n}\n```"
repo_url: https://huggingface.co/h2oai/h2ogpt-oasst1-512-12b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:11:10.994515.parquet'
- config_name: results
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- results_2023-07-19T18:11:10.994515.parquet
- split: latest
path:
- results_2023-07-19T18:11:10.994515.parquet
---
# Dataset Card for Evaluation run of h2oai/h2ogpt-oasst1-512-12b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/h2oai/h2ogpt-oasst1-512-12b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [h2oai/h2ogpt-oasst1-512-12b](https://huggingface.co/h2oai/h2ogpt-oasst1-512-12b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_h2oai__h2ogpt-oasst1-512-12b",
"harness_truthfulqa_mc_0",
split="train")
```
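Once downloaded, the per-run results file is a plain JSON object keyed by task name, with an `"all"` entry holding the aggregated metrics. A minimal sketch of pulling those values out of such a dict (the literal values here are a truncated illustration of the "Latest results" excerpt below, not the full file):

```python
# Minimal sketch: extract the aggregated metrics and per-task accuracies
# from a results dict shaped like the "Latest results" JSON below.
# The literal dict is an illustrative excerpt, not the complete file.
results = {
    "all": {"acc": 0.2670048335411601, "acc_stderr": 0.03183146115681845},
    "harness|arc:challenge|25": {"acc": 0.40273037542662116},
    "harness|hellaswag|10": {"acc": 0.5226050587532364},
}

overall_acc = results["all"]["acc"]  # aggregated accuracy across tasks
per_task = {name: m["acc"] for name, m in results.items() if name != "all"}
```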
## Latest results
These are the [latest results from run 2023-07-19T18:11:10.994515](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-oasst1-512-12b/blob/main/results_2023-07-19T18%3A11%3A10.994515.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2670048335411601,
"acc_stderr": 0.03183146115681845,
"acc_norm": 0.27040012961165716,
"acc_norm_stderr": 0.03182609816055044,
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602587,
"mc2": 0.364114682052683,
"mc2_stderr": 0.013501776376430328
},
"harness|arc:challenge|25": {
"acc": 0.40273037542662116,
"acc_stderr": 0.014332236306790147,
"acc_norm": 0.4232081911262799,
"acc_norm_stderr": 0.014438036220848029
},
"harness|hellaswag|10": {
"acc": 0.5226050587532364,
"acc_stderr": 0.004984679359375621,
"acc_norm": 0.7024497112129058,
"acc_norm_stderr": 0.004562462665505218
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.04094376269996793,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.04094376269996793
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.037827289808654685,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.037827289808654685
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2679245283018868,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.2679245283018868,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.03214737302029471,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.03214737302029471
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.040233822736177476,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.040233822736177476
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2425531914893617,
"acc_stderr": 0.028020226271200217,
"acc_norm": 0.2425531914893617,
"acc_norm_stderr": 0.028020226271200217
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512321984,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512321984
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.02286083830923207,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.02286083830923207
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03333333333333337,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03333333333333337
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24516129032258063,
"acc_stderr": 0.02447224384089553,
"acc_norm": 0.24516129032258063,
"acc_norm_stderr": 0.02447224384089553
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.03225799476233486,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.03225799476233486
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.033175059300091805,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.033175059300091805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20707070707070707,
"acc_stderr": 0.028869778460267052,
"acc_norm": 0.20707070707070707,
"acc_norm_stderr": 0.028869778460267052
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22279792746113988,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.22279792746113988,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.022139081103971527,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.022139081103971527
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2605042016806723,
"acc_stderr": 0.028510251512341926,
"acc_norm": 0.2605042016806723,
"acc_norm_stderr": 0.028510251512341926
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22018348623853212,
"acc_stderr": 0.017765978652327562,
"acc_norm": 0.22018348623853212,
"acc_norm_stderr": 0.017765978652327562
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2175925925925926,
"acc_stderr": 0.028139689444859686,
"acc_norm": 0.2175925925925926,
"acc_norm_stderr": 0.028139689444859686
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3811659192825112,
"acc_stderr": 0.03259625118416828,
"acc_norm": 0.3811659192825112,
"acc_norm_stderr": 0.03259625118416828
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.0372767357559692,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.0372767357559692
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4049586776859504,
"acc_stderr": 0.04481137755942469,
"acc_norm": 0.4049586776859504,
"acc_norm_stderr": 0.04481137755942469
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946315,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946315
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.03259177392742177,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.03259177392742177
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.04157751539865629,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.04157751539865629
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.04245022486384493,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.04245022486384493
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.028605953702004243,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.028605953702004243
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.3116219667943806,
"acc_stderr": 0.016562433867284176,
"acc_norm": 0.3116219667943806,
"acc_norm_stderr": 0.016562433867284176
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.29190751445086704,
"acc_stderr": 0.024476994076247326,
"acc_norm": 0.29190751445086704,
"acc_norm_stderr": 0.024476994076247326
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2908496732026144,
"acc_stderr": 0.02600480036395211,
"acc_norm": 0.2908496732026144,
"acc_norm_stderr": 0.02600480036395211
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.025171041915309684,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.025171041915309684
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930902013,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930902013
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24641460234680573,
"acc_stderr": 0.01100597139992724,
"acc_norm": 0.24641460234680573,
"acc_norm_stderr": 0.01100597139992724
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.19852941176470587,
"acc_stderr": 0.024231013370541107,
"acc_norm": 0.19852941176470587,
"acc_norm_stderr": 0.024231013370541107
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.28594771241830064,
"acc_stderr": 0.018280485072954676,
"acc_norm": 0.28594771241830064,
"acc_norm_stderr": 0.018280485072954676
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.1836734693877551,
"acc_stderr": 0.024789071332007636,
"acc_norm": 0.1836734693877551,
"acc_norm_stderr": 0.024789071332007636
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.263681592039801,
"acc_stderr": 0.031157150869355554,
"acc_norm": 0.263681592039801,
"acc_norm_stderr": 0.031157150869355554
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3132530120481928,
"acc_stderr": 0.036108050180310235,
"acc_norm": 0.3132530120481928,
"acc_norm_stderr": 0.036108050180310235
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.32748538011695905,
"acc_stderr": 0.035993357714560276,
"acc_norm": 0.32748538011695905,
"acc_norm_stderr": 0.035993357714560276
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602587,
"mc2": 0.364114682052683,
"mc2_stderr": 0.013501776376430328
}
}
```
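The per-task scores in the block above can be aggregated in plain Python. The sketch below is purely illustrative (the unweighted averaging scheme is an assumption, not necessarily the leaderboard's exact aggregation); the three accuracy values are copied verbatim from the results above:

```python
# Illustrative only: unweighted average over a hand-picked subset of the
# per-task accuracies quoted in the results block above. The task keys and
# values are copied from that block; the aggregation scheme is an assumption.
from statistics import mean

task_acc = {
    "harness|hendrycksTest-us_foreign_policy|5": 0.23,
    "harness|hendrycksTest-virology|5": 0.3132530120481928,
    "harness|hendrycksTest-world_religions|5": 0.32748538011695905,
}

average_acc = mean(task_acc.values())
print(f"average acc over {len(task_acc)} tasks: {average_acc:.4f}")
```

The full results file contains one such entry per MMLU subtask, so the same pattern extends to all 57 of them.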
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_h2oai__h2ogpt-oig-oasst1-256-6_9b | 2023-09-23T02:26:03.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of h2oai/h2ogpt-oig-oasst1-256-6_9b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [h2oai/h2ogpt-oig-oasst1-256-6_9b](https://huggingface.co/h2oai/h2ogpt-oig-oasst1-256-6_9b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h2oai__h2ogpt-oig-oasst1-256-6_9b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T02:25:51.324956](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-oig-oasst1-256-6_9b/blob/main/results_2023-09-23T02-25-51.324956.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0006291946308724832,\n\
\ \"em_stderr\": 0.0002568002749723939,\n \"f1\": 0.04599517617449677,\n\
\ \"f1_stderr\": 0.0011593544147047532,\n \"acc\": 0.3248508682225,\n\
\ \"acc_stderr\": 0.008493981824488952\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0006291946308724832,\n \"em_stderr\": 0.0002568002749723939,\n\
\ \"f1\": 0.04599517617449677,\n \"f1_stderr\": 0.0011593544147047532\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01592115238817286,\n \
\ \"acc_stderr\": 0.0034478192723890037\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6337805840568271,\n \"acc_stderr\": 0.013540144376588901\n\
\ }\n}\n```"
repo_url: https://huggingface.co/h2oai/h2ogpt-oig-oasst1-256-6_9b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T02_25_51.324956
path:
- '**/details_harness|drop|3_2023-09-23T02-25-51.324956.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T02-25-51.324956.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T02_25_51.324956
path:
- '**/details_harness|gsm8k|5_2023-09-23T02-25-51.324956.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T02-25-51.324956.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T02_25_51.324956
path:
- '**/details_harness|winogrande|5_2023-09-23T02-25-51.324956.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T02-25-51.324956.parquet'
- config_name: results
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- results_2023-07-19T17:44:24.016368.parquet
- split: 2023_09_23T02_25_51.324956
path:
- results_2023-09-23T02-25-51.324956.parquet
- split: latest
path:
- results_2023-09-23T02-25-51.324956.parquet
---
# Dataset Card for Evaluation run of h2oai/h2ogpt-oig-oasst1-256-6_9b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/h2oai/h2ogpt-oig-oasst1-256-6_9b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [h2oai/h2ogpt-oig-oasst1-256-6_9b](https://huggingface.co/h2oai/h2ogpt-oig-oasst1-256-6_9b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_h2oai__h2ogpt-oig-oasst1-256-6_9b",
"harness_winogrande_5",
split="train")
```
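Run splits are named after the evaluation timestamp, so the most recent run can also be recovered by sorting the split names directly. The helper below is an illustrative sketch (the `latest_split` name is ours, not part of the `datasets` API), assuming the leaderboard's `%Y_%m_%dT%H_%M_%S.%f` naming scheme:

```python
from datetime import datetime


def latest_split(split_names):
    """Return the most recent timestamped split name, ignoring the
    'latest' alias. Assumes split names like '2023_07_19T17_44_24.016368'."""
    fmt = "%Y_%m_%dT%H_%M_%S.%f"
    stamped = [s for s in split_names if s != "latest"]
    return max(stamped, key=lambda s: datetime.strptime(s, fmt))


print(latest_split(["2023_07_19T17_44_24.016368",
                    "2023_09_23T02_25_51.324956",
                    "latest"]))
# prints 2023_09_23T02_25_51.324956
```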
## Latest results
These are the [latest results from run 2023-09-23T02:25:51.324956](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-oig-oasst1-256-6_9b/blob/main/results_2023-09-23T02-25-51.324956.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0006291946308724832,
"em_stderr": 0.0002568002749723939,
"f1": 0.04599517617449677,
"f1_stderr": 0.0011593544147047532,
"acc": 0.3248508682225,
"acc_stderr": 0.008493981824488952
},
"harness|drop|3": {
"em": 0.0006291946308724832,
"em_stderr": 0.0002568002749723939,
"f1": 0.04599517617449677,
"f1_stderr": 0.0011593544147047532
},
"harness|gsm8k|5": {
"acc": 0.01592115238817286,
"acc_stderr": 0.0034478192723890037
},
"harness|winogrande|5": {
"acc": 0.6337805840568271,
"acc_stderr": 0.013540144376588901
}
}
```
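The per-task metrics above can also be consumed programmatically once the results JSON is loaded. A minimal sketch, with values copied from this card and the dict shape assumed from the payload above ("all" holds the aggregate, the other keys are tasks):

```python
# Illustrative: pick per-task accuracy out of a results payload shaped
# like the JSON above.
results = {
    "all": {"acc": 0.3248508682225, "em": 0.0006291946308724832},
    "harness|gsm8k|5": {"acc": 0.01592115238817286},
    "harness|winogrande|5": {"acc": 0.6337805840568271},
}
task_acc = {task: metrics["acc"]
            for task, metrics in results.items()
            if task != "all" and "acc" in metrics}
best_task = max(task_acc, key=task_acc.get)
print(best_task)  # prints harness|winogrande|5
```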
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-1024-12b | 2023-08-27T12:37:53.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of h2oai/h2ogpt-gm-oasst1-en-1024-12b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [h2oai/h2ogpt-gm-oasst1-en-1024-12b](https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-1024-12b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-1024-12b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-18T13:01:13.696108](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-1024-12b/blob/main/results_2023-07-18T13%3A01%3A13.696108.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26555608203945363,\n\
\ \"acc_stderr\": 0.031884177636802055,\n \"acc_norm\": 0.26906684156397603,\n\
\ \"acc_norm_stderr\": 0.031879451552421924,\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.014843061507731615,\n \"mc2\": 0.37997429044548037,\n\
\ \"mc2_stderr\": 0.013906650687558298\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4052901023890785,\n \"acc_stderr\": 0.014346869060229325,\n\
\ \"acc_norm\": 0.4308873720136519,\n \"acc_norm_stderr\": 0.014471133392642471\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5159330810595499,\n\
\ \"acc_stderr\": 0.004987247325495627,\n \"acc_norm\": 0.6974706233817964,\n\
\ \"acc_norm_stderr\": 0.004584144014654923\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.35555555555555557,\n\
\ \"acc_stderr\": 0.04135176749720386,\n \"acc_norm\": 0.35555555555555557,\n\
\ \"acc_norm_stderr\": 0.04135176749720386\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.26973684210526316,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.26973684210526316,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.36,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.25660377358490566,\n \"acc_stderr\": 0.026880647889051982,\n\
\ \"acc_norm\": 0.25660377358490566,\n \"acc_norm_stderr\": 0.026880647889051982\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\"\
: 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23121387283236994,\n\
\ \"acc_stderr\": 0.03214737302029471,\n \"acc_norm\": 0.23121387283236994,\n\
\ \"acc_norm_stderr\": 0.03214737302029471\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.33191489361702126,\n \"acc_stderr\": 0.030783736757745647,\n\
\ \"acc_norm\": 0.33191489361702126,\n \"acc_norm_stderr\": 0.030783736757745647\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.04227054451232199,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.04227054451232199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.18620689655172415,\n \"acc_stderr\": 0.032439461590046174,\n\
\ \"acc_norm\": 0.18620689655172415,\n \"acc_norm_stderr\": 0.032439461590046174\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2804232804232804,\n \"acc_stderr\": 0.023135287974325618,\n \"\
acc_norm\": 0.2804232804232804,\n \"acc_norm_stderr\": 0.023135287974325618\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.18253968253968253,\n\
\ \"acc_stderr\": 0.034550710191021496,\n \"acc_norm\": 0.18253968253968253,\n\
\ \"acc_norm_stderr\": 0.034550710191021496\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2645161290322581,\n\
\ \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.2645161290322581,\n\
\ \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603488,\n\
\ \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603488\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.029620227874790486,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.029620227874790486\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.20207253886010362,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.20207253886010362,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.21025641025641026,\n \"acc_stderr\": 0.020660597485026935,\n\
\ \"acc_norm\": 0.21025641025641026,\n \"acc_norm_stderr\": 0.020660597485026935\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2689075630252101,\n \"acc_stderr\": 0.028801392193631276,\n\
\ \"acc_norm\": 0.2689075630252101,\n \"acc_norm_stderr\": 0.028801392193631276\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2185430463576159,\n \"acc_stderr\": 0.03374235550425694,\n \"\
acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.03374235550425694\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21834862385321102,\n \"acc_stderr\": 0.017712600528722727,\n \"\
acc_norm\": 0.21834862385321102,\n \"acc_norm_stderr\": 0.017712600528722727\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.16203703703703703,\n \"acc_stderr\": 0.025130453652268455,\n \"\
acc_norm\": 0.16203703703703703,\n \"acc_norm_stderr\": 0.025130453652268455\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.22549019607843138,\n \"acc_stderr\": 0.029331162294251742,\n \"\
acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.029331162294251742\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.21940928270042195,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.21940928270042195,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3632286995515695,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.3632286995515695,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.36363636363636365,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.36363636363636365,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285713,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285713\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24358974358974358,\n\
\ \"acc_stderr\": 0.0281209665039144,\n \"acc_norm\": 0.24358974358974358,\n\
\ \"acc_norm_stderr\": 0.0281209665039144\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2950191570881226,\n\
\ \"acc_stderr\": 0.016308363772932724,\n \"acc_norm\": 0.2950191570881226,\n\
\ \"acc_norm_stderr\": 0.016308363772932724\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2630057803468208,\n \"acc_stderr\": 0.023703099525258165,\n\
\ \"acc_norm\": 0.2630057803468208,\n \"acc_norm_stderr\": 0.023703099525258165\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.26143790849673204,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2765273311897106,\n\
\ \"acc_stderr\": 0.025403832978179622,\n \"acc_norm\": 0.2765273311897106,\n\
\ \"acc_norm_stderr\": 0.025403832978179622\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.02465968518596727,\n\
\ \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.02465968518596727\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432407,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432407\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24119947848761408,\n\
\ \"acc_stderr\": 0.010926496102034961,\n \"acc_norm\": 0.24119947848761408,\n\
\ \"acc_norm_stderr\": 0.010926496102034961\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.21691176470588236,\n \"acc_stderr\": 0.025035845227711243,\n\
\ \"acc_norm\": 0.21691176470588236,\n \"acc_norm_stderr\": 0.025035845227711243\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2679738562091503,\n \"acc_stderr\": 0.017917974069594722,\n \
\ \"acc_norm\": 0.2679738562091503,\n \"acc_norm_stderr\": 0.017917974069594722\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.32727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252088,\n \"acc_norm\": 0.32727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252088\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.02560737598657916,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.02560737598657916\n },\n\
\ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3132530120481928,\n\
\ \"acc_stderr\": 0.036108050180310235,\n \"acc_norm\": 0.3132530120481928,\n\
\ \"acc_norm_stderr\": 0.036108050180310235\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.30409356725146197,\n \"acc_stderr\": 0.03528211258245233,\n\
\ \"acc_norm\": 0.30409356725146197,\n \"acc_norm_stderr\": 0.03528211258245233\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.014843061507731615,\n \"mc2\": 0.37997429044548037,\n\
\ \"mc2_stderr\": 0.013906650687558298\n }\n}\n```"
repo_url: https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-1024-12b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|arc:challenge|25_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hellaswag|10_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T13:01:13.696108.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T13:01:13.696108.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T13:01:13.696108.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T13:01:13.696108.parquet'
- config_name: results
data_files:
- split: 2023_07_18T13_01_13.696108
path:
- results_2023-07-18T13:01:13.696108.parquet
- split: latest
path:
- results_2023-07-18T13:01:13.696108.parquet
---
# Dataset Card for Evaluation run of h2oai/h2ogpt-gm-oasst1-en-1024-12b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-1024-12b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [h2oai/h2ogpt-gm-oasst1-en-1024-12b](https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-1024-12b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-1024-12b",
"harness_truthfulqa_mc_0",
split="train")
```
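Beyond `harness_truthfulqa_mc_0`, every MMLU sub-task listed in the YAML above has its own configuration. A minimal sketch for building those config names programmatically (the helper name is hypothetical; the naming scheme is taken from this card's YAML, where the resulting string is the second argument to `load_dataset`):

```python
# Hypothetical helper: build the per-task config name used by this dataset
# card for MMLU (hendrycksTest) sub-tasks, following the pattern visible in
# the YAML above, e.g. "harness_hendrycksTest_abstract_algebra_5".
def harness_config_name(task: str, num_fewshot: int = 5) -> str:
    """Return the config name for a hendrycksTest sub-task."""
    return f"harness_hendrycksTest_{task}_{num_fewshot}"

# The abstract-algebra details would then be loaded with:
#   load_dataset("open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-1024-12b",
#                harness_config_name("abstract_algebra"), split="latest")
print(harness_config_name("abstract_algebra"))
# harness_hendrycksTest_abstract_algebra_5
```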
## Latest results
These are the [latest results from run 2023-07-18T13:01:13.696108](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-1024-12b/blob/main/results_2023-07-18T13%3A01%3A13.696108.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26555608203945363,
"acc_stderr": 0.031884177636802055,
"acc_norm": 0.26906684156397603,
"acc_norm_stderr": 0.031879451552421924,
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731615,
"mc2": 0.37997429044548037,
"mc2_stderr": 0.013906650687558298
},
"harness|arc:challenge|25": {
"acc": 0.4052901023890785,
"acc_stderr": 0.014346869060229325,
"acc_norm": 0.4308873720136519,
"acc_norm_stderr": 0.014471133392642471
},
"harness|hellaswag|10": {
"acc": 0.5159330810595499,
"acc_stderr": 0.004987247325495627,
"acc_norm": 0.6974706233817964,
"acc_norm_stderr": 0.004584144014654923
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.04135176749720386,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.04135176749720386
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.26973684210526316,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.26973684210526316,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.25660377358490566,
"acc_stderr": 0.026880647889051982,
"acc_norm": 0.25660377358490566,
"acc_norm_stderr": 0.026880647889051982
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.03214737302029471,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.03214737302029471
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.33191489361702126,
"acc_stderr": 0.030783736757745647,
"acc_norm": 0.33191489361702126,
"acc_norm_stderr": 0.030783736757745647
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.04227054451232199,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.04227054451232199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.18620689655172415,
"acc_stderr": 0.032439461590046174,
"acc_norm": 0.18620689655172415,
"acc_norm_stderr": 0.032439461590046174
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2804232804232804,
"acc_stderr": 0.023135287974325618,
"acc_norm": 0.2804232804232804,
"acc_norm_stderr": 0.023135287974325618
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.18253968253968253,
"acc_stderr": 0.034550710191021496,
"acc_norm": 0.18253968253968253,
"acc_norm_stderr": 0.034550710191021496
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2645161290322581,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.2645161290322581,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.03287666758603488,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.03287666758603488
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.20207253886010362,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.20207253886010362,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.21025641025641026,
"acc_stderr": 0.020660597485026935,
"acc_norm": 0.21025641025641026,
"acc_norm_stderr": 0.020660597485026935
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2689075630252101,
"acc_stderr": 0.028801392193631276,
"acc_norm": 0.2689075630252101,
"acc_norm_stderr": 0.028801392193631276
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.03374235550425694,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.03374235550425694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21834862385321102,
"acc_stderr": 0.017712600528722727,
"acc_norm": 0.21834862385321102,
"acc_norm_stderr": 0.017712600528722727
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.16203703703703703,
"acc_stderr": 0.025130453652268455,
"acc_norm": 0.16203703703703703,
"acc_norm_stderr": 0.025130453652268455
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.029331162294251742,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.029331162294251742
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.21940928270042195,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.21940928270042195,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3632286995515695,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.3632286995515695,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.21374045801526717,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.21374045801526717,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.36363636363636365,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.36363636363636365,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285713,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285713
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.0281209665039144,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.0281209665039144
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2950191570881226,
"acc_stderr": 0.016308363772932724,
"acc_norm": 0.2950191570881226,
"acc_norm_stderr": 0.016308363772932724
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2630057803468208,
"acc_stderr": 0.023703099525258165,
"acc_norm": 0.2630057803468208,
"acc_norm_stderr": 0.023703099525258165
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2765273311897106,
"acc_stderr": 0.025403832978179622,
"acc_norm": 0.2765273311897106,
"acc_norm_stderr": 0.025403832978179622
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.02465968518596727,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.02465968518596727
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432407,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24119947848761408,
"acc_stderr": 0.010926496102034961,
"acc_norm": 0.24119947848761408,
"acc_norm_stderr": 0.010926496102034961
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.21691176470588236,
"acc_stderr": 0.025035845227711243,
"acc_norm": 0.21691176470588236,
"acc_norm_stderr": 0.025035845227711243
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2679738562091503,
"acc_stderr": 0.017917974069594722,
"acc_norm": 0.2679738562091503,
"acc_norm_stderr": 0.017917974069594722
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.32727272727272727,
"acc_stderr": 0.04494290866252088,
"acc_norm": 0.32727272727272727,
"acc_norm_stderr": 0.04494290866252088
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2,
"acc_stderr": 0.02560737598657916,
"acc_norm": 0.2,
"acc_norm_stderr": 0.02560737598657916
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3132530120481928,
"acc_stderr": 0.036108050180310235,
"acc_norm": 0.3132530120481928,
"acc_norm_stderr": 0.036108050180310235
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30409356725146197,
"acc_stderr": 0.03528211258245233,
"acc_norm": 0.30409356725146197,
"acc_norm_stderr": 0.03528211258245233
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731615,
"mc2": 0.37997429044548037,
"mc2_stderr": 0.013906650687558298
}
}
```
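The leaderboard's aggregate metrics are (roughly) means over the per-task scores reported above. A minimal sketch, using a few of the accuracy values from the results JSON (task names and values copied verbatim from above):

```python
# Minimal sketch: averaging per-task accuracies, as the leaderboard's
# aggregate "acc" is (roughly) a mean over the individual task scores.
# The three values below are copied from the results JSON above.
task_acc = {
    "hendrycksTest-human_aging": 0.3632286995515695,
    "hendrycksTest-virology": 0.3132530120481928,
    "hendrycksTest-world_religions": 0.30409356725146197,
}

mean_acc = sum(task_acc.values()) / len(task_acc)
print(round(mean_acc, 4))  # mean accuracy over these three tasks
```

The full leaderboard average is computed the same way over all evaluated tasks, not just the three shown here.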
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_h2oai__h2ogpt-oig-oasst1-512-6_9b | 2023-09-22T15:42:21.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of h2oai/h2ogpt-oig-oasst1-512-6_9b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [h2oai/h2ogpt-oig-oasst1-512-6_9b](https://huggingface.co/h2oai/h2ogpt-oig-oasst1-512-6_9b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h2oai__h2ogpt-oig-oasst1-512-6_9b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T15:42:09.246441](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-oig-oasst1-512-6_9b/blob/main/results_2023-09-22T15-42-09.246441.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.00041946308724832214,\n\
\ \"em_stderr\": 0.00020969854707829136,\n \"f1\": 0.04082843959731554,\n\
\ \"f1_stderr\": 0.00109074720252004,\n \"acc\": 0.3174773048631111,\n\
\ \"acc_stderr\": 0.00816331055041483\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.00041946308724832214,\n \"em_stderr\": 0.00020969854707829136,\n\
\ \"f1\": 0.04082843959731554,\n \"f1_stderr\": 0.00109074720252004\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009855951478392721,\n \
\ \"acc_stderr\": 0.002721076577041661\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6250986582478295,\n \"acc_stderr\": 0.013605544523787998\n\
\ }\n}\n```"
repo_url: https://huggingface.co/h2oai/h2ogpt-oig-oasst1-512-6_9b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T15_42_09.246441
path:
- '**/details_harness|drop|3_2023-09-22T15-42-09.246441.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T15-42-09.246441.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T15_42_09.246441
path:
- '**/details_harness|gsm8k|5_2023-09-22T15-42-09.246441.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T15-42-09.246441.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:42:14.893645.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:42:14.893645.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:42:14.893645.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T15_42_09.246441
path:
- '**/details_harness|winogrande|5_2023-09-22T15-42-09.246441.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T15-42-09.246441.parquet'
- config_name: results
data_files:
- split: 2023_07_19T17_42_14.893645
path:
- results_2023-07-19T17:42:14.893645.parquet
- split: 2023_09_22T15_42_09.246441
path:
- results_2023-09-22T15-42-09.246441.parquet
- split: latest
path:
- results_2023-09-22T15-42-09.246441.parquet
---
# Dataset Card for Evaluation run of h2oai/h2ogpt-oig-oasst1-512-6_9b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/h2oai/h2ogpt-oig-oasst1-512-6_9b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [h2oai/h2ogpt-oig-oasst1-512-6_9b](https://huggingface.co/h2oai/h2ogpt-oig-oasst1-512-6_9b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_h2oai__h2ogpt-oig-oasst1-512-6_9b",
"harness_winogrande_5",
	split="latest")
```
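Each run is also stored under a split named after its timestamp, with "-" and ":" replaced by "_" (compare the parquet file names and split names in the YAML header above). A minimal sketch of that mapping, using a hypothetical helper name:

```python
# Hypothetical helper (not part of the datasets library): derive the
# split name used in this repository from a run timestamp by replacing
# "-" and ":" with "_".
def timestamp_to_split(timestamp: str) -> str:
    return timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-09-22T15:42:09.246441"))
# → 2023_09_22T15_42_09.246441
```

The resulting string can be passed as `split=` to `load_dataset` to pin a specific run instead of "latest".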
## Latest results
These are the [latest results from run 2023-09-22T15:42:09.246441](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-oig-oasst1-512-6_9b/blob/main/results_2023-09-22T15-42-09.246441.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.00041946308724832214,
"em_stderr": 0.00020969854707829136,
"f1": 0.04082843959731554,
"f1_stderr": 0.00109074720252004,
"acc": 0.3174773048631111,
"acc_stderr": 0.00816331055041483
},
"harness|drop|3": {
"em": 0.00041946308724832214,
"em_stderr": 0.00020969854707829136,
"f1": 0.04082843959731554,
"f1_stderr": 0.00109074720252004
},
"harness|gsm8k|5": {
"acc": 0.009855951478392721,
"acc_stderr": 0.002721076577041661
},
"harness|winogrande|5": {
"acc": 0.6250986582478295,
"acc_stderr": 0.013605544523787998
}
}
```
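In the JSON above, the aggregated `"acc"` in the `"all"` block appears to be the plain mean of the per-task accuracies (DROP contributes only `em`/`f1`, not `acc`). A self-contained sketch that recomputes it from the values shown:

```python
# Per-task accuracies copied from the results JSON above; DROP reports
# em/f1 rather than acc, so it does not enter the acc average.
task_acc = {
    "harness|gsm8k|5": 0.009855951478392721,
    "harness|winogrande|5": 0.6250986582478295,
}

mean_acc = sum(task_acc.values()) / len(task_acc)
# mean_acc matches the "acc" reported in the "all" block above.
```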
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
---
pretty_name: Evaluation run of h2oai/h2ogpt-gm-oasst1-en-1024-open-llama-7b-preview-400bt
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [h2oai/h2ogpt-gm-oasst1-en-1024-open-llama-7b-preview-400bt](https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-1024-open-llama-7b-preview-400bt)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-1024-open-llama-7b-preview-400bt\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-19T17:21:26.476069](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-1024-open-llama-7b-preview-400bt/blob/main/results_2023-07-19T17%3A21%3A26.476069.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each one in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2806711118058944,\n\
\ \"acc_stderr\": 0.03225349934716525,\n \"acc_norm\": 0.2837639442040318,\n\
\ \"acc_norm_stderr\": 0.03225483921398825,\n \"mc1\": 0.26560587515299877,\n\
\ \"mc1_stderr\": 0.015461027627253595,\n \"mc2\": 0.4200454196574927,\n\
\ \"mc2_stderr\": 0.014213928953331009\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.37627986348122866,\n \"acc_stderr\": 0.014157022555407175,\n\
\ \"acc_norm\": 0.4129692832764505,\n \"acc_norm_stderr\": 0.014388344935398326\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4785899223262298,\n\
\ \"acc_stderr\": 0.004985204766555062,\n \"acc_norm\": 0.6243776140211114,\n\
\ \"acc_norm_stderr\": 0.004832934529120798\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.31851851851851853,\n\
\ \"acc_stderr\": 0.0402477840197711,\n \"acc_norm\": 0.31851851851851853,\n\
\ \"acc_norm_stderr\": 0.0402477840197711\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.036906779861372814,\n\
\ \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.036906779861372814\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2981132075471698,\n \"acc_stderr\": 0.028152837942493857,\n\
\ \"acc_norm\": 0.2981132075471698,\n \"acc_norm_stderr\": 0.028152837942493857\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.36,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2127659574468085,\n \"acc_stderr\": 0.026754391348039776,\n\
\ \"acc_norm\": 0.2127659574468085,\n \"acc_norm_stderr\": 0.026754391348039776\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.30344827586206896,\n \"acc_stderr\": 0.038312260488503336,\n\
\ \"acc_norm\": 0.30344827586206896,\n \"acc_norm_stderr\": 0.038312260488503336\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577656,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577656\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n\
\ \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n\
\ \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25161290322580643,\n\
\ \"acc_stderr\": 0.024685979286239956,\n \"acc_norm\": 0.25161290322580643,\n\
\ \"acc_norm_stderr\": 0.024685979286239956\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.031270907132976984,\n\
\ \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.031270907132976984\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547153,\n \"\
acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547153\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.37823834196891193,\n \"acc_stderr\": 0.03499807276193339,\n\
\ \"acc_norm\": 0.37823834196891193,\n \"acc_norm_stderr\": 0.03499807276193339\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.33589743589743587,\n \"acc_stderr\": 0.02394672474156398,\n\
\ \"acc_norm\": 0.33589743589743587,\n \"acc_norm_stderr\": 0.02394672474156398\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3486238532110092,\n \"acc_stderr\": 0.020431254090714328,\n \"\
acc_norm\": 0.3486238532110092,\n \"acc_norm_stderr\": 0.020431254090714328\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896078,\n \"\
acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896078\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.27450980392156865,\n \"acc_stderr\": 0.031321798030832904,\n \"\
acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.031321798030832904\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658335,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658335\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.12556053811659193,\n\
\ \"acc_stderr\": 0.022238985469323774,\n \"acc_norm\": 0.12556053811659193,\n\
\ \"acc_norm_stderr\": 0.022238985469323774\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.371900826446281,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16071428571428573,\n\
\ \"acc_stderr\": 0.03485946096475741,\n \"acc_norm\": 0.16071428571428573,\n\
\ \"acc_norm_stderr\": 0.03485946096475741\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258972,\n\
\ \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258972\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n\
\ \"acc_stderr\": 0.02860595370200425,\n \"acc_norm\": 0.2564102564102564,\n\
\ \"acc_norm_stderr\": 0.02860595370200425\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.20306513409961685,\n\
\ \"acc_stderr\": 0.014385525076611578,\n \"acc_norm\": 0.20306513409961685,\n\
\ \"acc_norm_stderr\": 0.014385525076611578\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.29190751445086704,\n \"acc_stderr\": 0.02447699407624734,\n\
\ \"acc_norm\": 0.29190751445086704,\n \"acc_norm_stderr\": 0.02447699407624734\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875195,\n\
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875195\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.2990353697749196,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.023246202647819746,\n\
\ \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.023246202647819746\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590638,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590638\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27640156453715775,\n\
\ \"acc_stderr\": 0.011422153194553577,\n \"acc_norm\": 0.27640156453715775,\n\
\ \"acc_norm_stderr\": 0.011422153194553577\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.03018753206032938,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.03018753206032938\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2434640522875817,\n \"acc_stderr\": 0.017362473762146627,\n \
\ \"acc_norm\": 0.2434640522875817,\n \"acc_norm_stderr\": 0.017362473762146627\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.031362502409358936,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.031362502409358936\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.18072289156626506,\n\
\ \"acc_stderr\": 0.02995573785581014,\n \"acc_norm\": 0.18072289156626506,\n\
\ \"acc_norm_stderr\": 0.02995573785581014\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.034886477134579215,\n\
\ \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.034886477134579215\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26560587515299877,\n\
\ \"mc1_stderr\": 0.015461027627253595,\n \"mc2\": 0.4200454196574927,\n\
\ \"mc2_stderr\": 0.014213928953331009\n }\n}\n```"
repo_url: https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-1024-open-llama-7b-preview-400bt
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:21:26.476069.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:21:26.476069.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:21:26.476069.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:21:26.476069.parquet'
- config_name: results
data_files:
- split: 2023_07_19T17_21_26.476069
path:
- results_2023-07-19T17:21:26.476069.parquet
- split: latest
path:
- results_2023-07-19T17:21:26.476069.parquet
---
# Dataset Card for Evaluation run of h2oai/h2ogpt-gm-oasst1-en-1024-open-llama-7b-preview-400bt
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-1024-open-llama-7b-preview-400bt
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [h2oai/h2ogpt-gm-oasst1-en-1024-open-llama-7b-preview-400bt](https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-1024-open-llama-7b-preview-400bt) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
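As a minimal sketch of the split-naming convention (assuming the helper name `split_name_to_datetime` for illustration): split names encode the run timestamp with the `-` and `:` separators replaced by `_`, so they can be mapped back to the original ISO-8601 timestamp like this:

```python
from datetime import datetime

def split_name_to_datetime(split_name: str) -> datetime:
    """Recover the run timestamp from a split name such as
    '2023_07_19T17_21_26.476069' (i.e. 2023-07-19T17:21:26.476069)."""
    date_part, time_part = split_name.split("T")
    # Restore the ISO-8601 separators that were replaced for split naming
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

print(split_name_to_datetime("2023_07_19T17_21_26.476069"))
# → 2023-07-19 17:21:26.476069
```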
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-1024-open-llama-7b-preview-400bt",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-07-19T17:21:26.476069](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-1024-open-llama-7b-preview-400bt/blob/main/results_2023-07-19T17%3A21%3A26.476069.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2806711118058944,
"acc_stderr": 0.03225349934716525,
"acc_norm": 0.2837639442040318,
"acc_norm_stderr": 0.03225483921398825,
"mc1": 0.26560587515299877,
"mc1_stderr": 0.015461027627253595,
"mc2": 0.4200454196574927,
"mc2_stderr": 0.014213928953331009
},
"harness|arc:challenge|25": {
"acc": 0.37627986348122866,
"acc_stderr": 0.014157022555407175,
"acc_norm": 0.4129692832764505,
"acc_norm_stderr": 0.014388344935398326
},
"harness|hellaswag|10": {
"acc": 0.4785899223262298,
"acc_stderr": 0.004985204766555062,
"acc_norm": 0.6243776140211114,
"acc_norm_stderr": 0.004832934529120798
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.0402477840197711,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.0402477840197711
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.036906779861372814,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.036906779861372814
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2981132075471698,
"acc_stderr": 0.028152837942493857,
"acc_norm": 0.2981132075471698,
"acc_norm_stderr": 0.028152837942493857
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2127659574468085,
"acc_stderr": 0.026754391348039776,
"acc_norm": 0.2127659574468085,
"acc_norm_stderr": 0.026754391348039776
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.30344827586206896,
"acc_stderr": 0.038312260488503336,
"acc_norm": 0.30344827586206896,
"acc_norm_stderr": 0.038312260488503336
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577656,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.02278967314577656
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25161290322580643,
"acc_stderr": 0.024685979286239956,
"acc_norm": 0.25161290322580643,
"acc_norm_stderr": 0.024685979286239956
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.031270907132976984,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.031270907132976984
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35353535353535354,
"acc_stderr": 0.03406086723547153,
"acc_norm": 0.35353535353535354,
"acc_norm_stderr": 0.03406086723547153
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.37823834196891193,
"acc_stderr": 0.03499807276193339,
"acc_norm": 0.37823834196891193,
"acc_norm_stderr": 0.03499807276193339
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.33589743589743587,
"acc_stderr": 0.02394672474156398,
"acc_norm": 0.33589743589743587,
"acc_norm_stderr": 0.02394672474156398
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3486238532110092,
"acc_stderr": 0.020431254090714328,
"acc_norm": 0.3486238532110092,
"acc_norm_stderr": 0.020431254090714328
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4212962962962963,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.4212962962962963,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.031321798030832904,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.031321798030832904
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658335,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658335
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.12556053811659193,
"acc_stderr": 0.022238985469323774,
"acc_norm": 0.12556053811659193,
"acc_norm_stderr": 0.022238985469323774
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.16071428571428573,
"acc_stderr": 0.03485946096475741,
"acc_norm": 0.16071428571428573,
"acc_norm_stderr": 0.03485946096475741
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258972,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258972
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.02860595370200425,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.02860595370200425
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.20306513409961685,
"acc_stderr": 0.014385525076611578,
"acc_norm": 0.20306513409961685,
"acc_norm_stderr": 0.014385525076611578
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.29190751445086704,
"acc_stderr": 0.02447699407624734,
"acc_norm": 0.29190751445086704,
"acc_norm_stderr": 0.02447699407624734
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.023246202647819746,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.023246202647819746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590638,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590638
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27640156453715775,
"acc_stderr": 0.011422153194553577,
"acc_norm": 0.27640156453715775,
"acc_norm_stderr": 0.011422153194553577
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.03018753206032938,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.03018753206032938
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2434640522875817,
"acc_stderr": 0.017362473762146627,
"acc_norm": 0.2434640522875817,
"acc_norm_stderr": 0.017362473762146627
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.038950910157241364,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.038950910157241364
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.4,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.18072289156626506,
"acc_stderr": 0.02995573785581014,
"acc_norm": 0.18072289156626506,
"acc_norm_stderr": 0.02995573785581014
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.034886477134579215,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.034886477134579215
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26560587515299877,
"mc1_stderr": 0.015461027627253595,
"mc2": 0.4200454196574927,
"mc2_stderr": 0.014213928953331009
}
}
```
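For quick offline inspection, the per-task numbers in the JSON block above can be aggregated directly, e.g. an unweighted macro-average over a few of the MMLU (hendrycksTest) subtasks. The values below are copied verbatim from the results block; the `macro_average` helper is illustrative only and not part of the leaderboard tooling:

```python
# Macro-average accuracy over a handful of the MMLU (hendrycksTest) subtasks
# reported above. Scores are copied verbatim from the results JSON; the
# helper function is an illustrative sketch, not leaderboard tooling.
results = {
    "harness|hendrycksTest-management|5": 0.3786407766990291,
    "harness|hendrycksTest-marketing|5": 0.2564102564102564,
    "harness|hendrycksTest-medical_genetics|5": 0.2,
}

def macro_average(scores):
    """Unweighted mean of the per-task accuracy values."""
    return sum(scores.values()) / len(scores)

print(f"macro-average acc: {macro_average(results):.4f}")
```

The full leaderboard average is computed over all 57 MMLU subtasks in the same way; this snippet just shows the arithmetic on a small subset.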
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-1024-20b | 2023-08-27T12:37:58.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of h2oai/h2ogpt-gm-oasst1-en-1024-20b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [h2oai/h2ogpt-gm-oasst1-en-1024-20b](https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-1024-20b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-1024-20b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-19T21:35:35.780060](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-1024-20b/blob/main/results_2023-07-19T21%3A35%3A35.780060.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.267309798020508,\n\
\ \"acc_stderr\": 0.03194870950912509,\n \"acc_norm\": 0.2713118331128494,\n\
\ \"acc_norm_stderr\": 0.03194156512180179,\n \"mc1\": 0.24724602203182375,\n\
\ \"mc1_stderr\": 0.015102404797359652,\n \"mc2\": 0.39924157003004007,\n\
\ \"mc2_stderr\": 0.014866348805427145\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4351535836177474,\n \"acc_stderr\": 0.014487986197186045,\n\
\ \"acc_norm\": 0.4803754266211604,\n \"acc_norm_stderr\": 0.014600132075947094\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5367456681935869,\n\
\ \"acc_stderr\": 0.004976288321681822,\n \"acc_norm\": 0.7276438956383191,\n\
\ \"acc_norm_stderr\": 0.00444262359084632\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.28888888888888886,\n\
\ \"acc_stderr\": 0.03915450630414251,\n \"acc_norm\": 0.28888888888888886,\n\
\ \"acc_norm_stderr\": 0.03915450630414251\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.03583496176361063,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.03583496176361063\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2339622641509434,\n \"acc_stderr\": 0.02605529690115292,\n\
\ \"acc_norm\": 0.2339622641509434,\n \"acc_norm_stderr\": 0.02605529690115292\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483099,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483099\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793275,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793275\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2936170212765957,\n \"acc_stderr\": 0.029771642712491227,\n\
\ \"acc_norm\": 0.2936170212765957,\n \"acc_norm_stderr\": 0.029771642712491227\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.1724137931034483,\n \"acc_stderr\": 0.03147830790259575,\n\
\ \"acc_norm\": 0.1724137931034483,\n \"acc_norm_stderr\": 0.03147830790259575\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643895,\n \"\
acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643895\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.040735243221471276,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.040735243221471276\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24516129032258063,\n\
\ \"acc_stderr\": 0.024472243840895535,\n \"acc_norm\": 0.24516129032258063,\n\
\ \"acc_norm_stderr\": 0.024472243840895535\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.23645320197044334,\n \"acc_stderr\": 0.029896114291733552,\n\
\ \"acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.029896114291733552\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.031234752377721175,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.031234752377721175\n \
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21212121212121213,\n \"acc_stderr\": 0.029126522834586832,\n \"\
acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.029126522834586832\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.24352331606217617,\n \"acc_stderr\": 0.030975436386845426,\n\
\ \"acc_norm\": 0.24352331606217617,\n \"acc_norm_stderr\": 0.030975436386845426\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.24358974358974358,\n \"acc_stderr\": 0.02176373368417394,\n\
\ \"acc_norm\": 0.24358974358974358,\n \"acc_norm_stderr\": 0.02176373368417394\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.25210084033613445,\n \"acc_stderr\": 0.028205545033277726,\n\
\ \"acc_norm\": 0.25210084033613445,\n \"acc_norm_stderr\": 0.028205545033277726\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.2036697247706422,\n\
\ \"acc_stderr\": 0.017266742087630797,\n \"acc_norm\": 0.2036697247706422,\n\
\ \"acc_norm_stderr\": 0.017266742087630797\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.16203703703703703,\n \"acc_stderr\": 0.025130453652268455,\n\
\ \"acc_norm\": 0.16203703703703703,\n \"acc_norm_stderr\": 0.025130453652268455\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.3088235294117647,\n \"acc_stderr\": 0.03242661719827218,\n \"\
acc_norm\": 0.3088235294117647,\n \"acc_norm_stderr\": 0.03242661719827218\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.28270042194092826,\n \"acc_stderr\": 0.029312814153955934,\n \
\ \"acc_norm\": 0.28270042194092826,\n \"acc_norm_stderr\": 0.029312814153955934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.28699551569506726,\n\
\ \"acc_stderr\": 0.030360379710291936,\n \"acc_norm\": 0.28699551569506726,\n\
\ \"acc_norm_stderr\": 0.030360379710291936\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.037276735755969195,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.037276735755969195\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04330043749650744,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04330043749650744\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16964285714285715,\n\
\ \"acc_stderr\": 0.0356236785009539,\n \"acc_norm\": 0.16964285714285715,\n\
\ \"acc_norm_stderr\": 0.0356236785009539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3106796116504854,\n \"acc_stderr\": 0.04582124160161551,\n\
\ \"acc_norm\": 0.3106796116504854,\n \"acc_norm_stderr\": 0.04582124160161551\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2606837606837607,\n\
\ \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.2606837606837607,\n\
\ \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2848020434227331,\n\
\ \"acc_stderr\": 0.016139174096522556,\n \"acc_norm\": 0.2848020434227331,\n\
\ \"acc_norm_stderr\": 0.016139174096522556\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2832369942196532,\n \"acc_stderr\": 0.02425790170532337,\n\
\ \"acc_norm\": 0.2832369942196532,\n \"acc_norm_stderr\": 0.02425790170532337\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2681564245810056,\n\
\ \"acc_stderr\": 0.014816119635317015,\n \"acc_norm\": 0.2681564245810056,\n\
\ \"acc_norm_stderr\": 0.014816119635317015\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2908496732026144,\n \"acc_stderr\": 0.02600480036395211,\n\
\ \"acc_norm\": 0.2908496732026144,\n \"acc_norm_stderr\": 0.02600480036395211\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n\
\ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.2733118971061093,\n\
\ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.025630824975621365,\n\
\ \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.025630824975621365\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843007,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843007\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26727509778357234,\n\
\ \"acc_stderr\": 0.011302607515637515,\n \"acc_norm\": 0.26727509778357234,\n\
\ \"acc_norm_stderr\": 0.011302607515637515\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.02439819298665492,\n\
\ \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.02439819298665492\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2679738562091503,\n \"acc_stderr\": 0.017917974069594726,\n \
\ \"acc_norm\": 0.2679738562091503,\n \"acc_norm_stderr\": 0.017917974069594726\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.03895091015724137,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.03895091015724137\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2938775510204082,\n \"acc_stderr\": 0.02916273841024977,\n\
\ \"acc_norm\": 0.2938775510204082,\n \"acc_norm_stderr\": 0.02916273841024977\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n\
\ \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.23383084577114427,\n\
\ \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.03460579907553027,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.03460579907553027\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3391812865497076,\n \"acc_stderr\": 0.036310534964889056,\n\
\ \"acc_norm\": 0.3391812865497076,\n \"acc_norm_stderr\": 0.036310534964889056\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24724602203182375,\n\
\ \"mc1_stderr\": 0.015102404797359652,\n \"mc2\": 0.39924157003004007,\n\
\ \"mc2_stderr\": 0.014866348805427145\n }\n}\n```"
repo_url: https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-1024-20b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|arc:challenge|25_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hellaswag|10_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T21:35:35.780060.parquet'
- config_name: results
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- results_2023-07-19T21:35:35.780060.parquet
- split: latest
path:
- results_2023-07-19T21:35:35.780060.parquet
---
# Dataset Card for Evaluation run of h2oai/h2ogpt-gm-oasst1-en-1024-20b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-1024-20b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [h2oai/h2ogpt-gm-oasst1-en-1024-20b](https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-1024-20b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-1024-20b",
"harness_truthfulqa_mc_0",
split="train")
```
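Split names in this dataset are derived from each run's ISO timestamp, with `-` and `:` replaced by `_`; this is inferred from the config listing above, where the run timestamp `2023-07-19T21:35:35.780060` maps to the split `2023_07_19T21_35_35.780060`. A small helper to build a split name from a timestamp could look like:

```python
def timestamp_to_split(iso_timestamp: str) -> str:
    """Map a run timestamp such as '2023-07-19T21:35:35.780060' to the
    corresponding split name, e.g. '2023_07_19T21_35_35.780060'.

    This mirrors the naming scheme visible in the configs above; it is
    an inference from this card, not an official API.
    """
    return iso_timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-07-19T21:35:35.780060"))
```

Passing the resulting name as `split=` to `load_dataset` selects that specific run, while `split="latest"` always selects the most recent one.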
## Latest results
These are the [latest results from run 2023-07-19T21:35:35.780060](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-1024-20b/blob/main/results_2023-07-19T21%3A35%3A35.780060.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.267309798020508,
"acc_stderr": 0.03194870950912509,
"acc_norm": 0.2713118331128494,
"acc_norm_stderr": 0.03194156512180179,
"mc1": 0.24724602203182375,
"mc1_stderr": 0.015102404797359652,
"mc2": 0.39924157003004007,
"mc2_stderr": 0.014866348805427145
},
"harness|arc:challenge|25": {
"acc": 0.4351535836177474,
"acc_stderr": 0.014487986197186045,
"acc_norm": 0.4803754266211604,
"acc_norm_stderr": 0.014600132075947094
},
"harness|hellaswag|10": {
"acc": 0.5367456681935869,
"acc_stderr": 0.004976288321681822,
"acc_norm": 0.7276438956383191,
"acc_norm_stderr": 0.00444262359084632
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.03915450630414251,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.03915450630414251
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.03583496176361063,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.03583496176361063
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2339622641509434,
"acc_stderr": 0.02605529690115292,
"acc_norm": 0.2339622641509434,
"acc_norm_stderr": 0.02605529690115292
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483099,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483099
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.044405219061793275,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.044405219061793275
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2936170212765957,
"acc_stderr": 0.029771642712491227,
"acc_norm": 0.2936170212765957,
"acc_norm_stderr": 0.029771642712491227
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.1724137931034483,
"acc_stderr": 0.03147830790259575,
"acc_norm": 0.1724137931034483,
"acc_norm_stderr": 0.03147830790259575
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643895,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643895
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471276,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471276
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24516129032258063,
"acc_stderr": 0.024472243840895535,
"acc_norm": 0.24516129032258063,
"acc_norm_stderr": 0.024472243840895535
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.029896114291733552,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.029896114291733552
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2,
"acc_stderr": 0.031234752377721175,
"acc_norm": 0.2,
"acc_norm_stderr": 0.031234752377721175
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.029126522834586832,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.029126522834586832
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24352331606217617,
"acc_stderr": 0.030975436386845426,
"acc_norm": 0.24352331606217617,
"acc_norm_stderr": 0.030975436386845426
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.02176373368417394,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.02176373368417394
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.25210084033613445,
"acc_stderr": 0.028205545033277726,
"acc_norm": 0.25210084033613445,
"acc_norm_stderr": 0.028205545033277726
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.2036697247706422,
"acc_stderr": 0.017266742087630797,
"acc_norm": 0.2036697247706422,
"acc_norm_stderr": 0.017266742087630797
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.16203703703703703,
"acc_stderr": 0.025130453652268455,
"acc_norm": 0.16203703703703703,
"acc_norm_stderr": 0.025130453652268455
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3088235294117647,
"acc_stderr": 0.03242661719827218,
"acc_norm": 0.3088235294117647,
"acc_norm_stderr": 0.03242661719827218
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.28270042194092826,
"acc_stderr": 0.029312814153955934,
"acc_norm": 0.28270042194092826,
"acc_norm_stderr": 0.029312814153955934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.28699551569506726,
"acc_stderr": 0.030360379710291936,
"acc_norm": 0.28699551569506726,
"acc_norm_stderr": 0.030360379710291936
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.037276735755969195,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.037276735755969195
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650744,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650744
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.16964285714285715,
"acc_stderr": 0.0356236785009539,
"acc_norm": 0.16964285714285715,
"acc_norm_stderr": 0.0356236785009539
},
"harness|hendrycksTest-management|5": {
"acc": 0.3106796116504854,
"acc_stderr": 0.04582124160161551,
"acc_norm": 0.3106796116504854,
"acc_norm_stderr": 0.04582124160161551
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2606837606837607,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.2606837606837607,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2848020434227331,
"acc_stderr": 0.016139174096522556,
"acc_norm": 0.2848020434227331,
"acc_norm_stderr": 0.016139174096522556
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2832369942196532,
"acc_stderr": 0.02425790170532337,
"acc_norm": 0.2832369942196532,
"acc_norm_stderr": 0.02425790170532337
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2681564245810056,
"acc_stderr": 0.014816119635317015,
"acc_norm": 0.2681564245810056,
"acc_norm_stderr": 0.014816119635317015
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2908496732026144,
"acc_stderr": 0.02600480036395211,
"acc_norm": 0.2908496732026144,
"acc_norm_stderr": 0.02600480036395211
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2733118971061093,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.2733118971061093,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.025630824975621365,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.025630824975621365
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843007,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843007
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26727509778357234,
"acc_stderr": 0.011302607515637515,
"acc_norm": 0.26727509778357234,
"acc_norm_stderr": 0.011302607515637515
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.02439819298665492,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.02439819298665492
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2679738562091503,
"acc_stderr": 0.017917974069594726,
"acc_norm": 0.2679738562091503,
"acc_norm_stderr": 0.017917974069594726
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.03895091015724137,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.03895091015724137
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2938775510204082,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.2938775510204082,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348384,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348384
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553027,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553027
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3391812865497076,
"acc_stderr": 0.036310534964889056,
"acc_norm": 0.3391812865497076,
"acc_norm_stderr": 0.036310534964889056
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24724602203182375,
"mc1_stderr": 0.015102404797359652,
"mc2": 0.39924157003004007,
"mc2_stderr": 0.014866348805427145
}
}
```
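The per-task entries above can be aggregated locally without re-downloading anything. As a minimal sketch, here is how an average over the `hendrycksTest` (MMLU) tasks might be computed, using a three-task excerpt of the `acc_norm` values printed above (the full payload contains 57 such tasks):

```python
# A small excerpt of the results payload above; key names follow the
# harness scheme "harness|hendrycksTest-<task>|<num_fewshot>".
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.32},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.28888888888888886},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.2631578947368421},
}

# Average acc_norm over all hendrycksTest tasks present in the payload.
mmlu_keys = [k for k in results if k.startswith("harness|hendrycksTest-")]
mmlu_avg = sum(results[k]["acc_norm"] for k in mmlu_keys) / len(mmlu_keys)
print(f"MMLU acc_norm over {len(mmlu_keys)} tasks: {mmlu_avg:.4f}")
```

Run over the complete results dict rather than this excerpt, the same loop yields a single aggregate across all 57 MMLU subtasks.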
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]

---
pretty_name: Evaluation run of h2oai/h2ogpt-oasst1-512-20b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [h2oai/h2ogpt-oasst1-512-20b](https://huggingface.co/h2oai/h2ogpt-oasst1-512-20b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h2oai__h2ogpt-oasst1-512-20b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-19T21:43:07.012781](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-oasst1-512-20b/blob/main/results_2023-07-19T21%3A43%3A07.012781.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.27000963699095726,\n\
\ \"acc_stderr\": 0.03207462279029465,\n \"acc_norm\": 0.2738892269037556,\n\
\ \"acc_norm_stderr\": 0.032067484735330685,\n \"mc1\": 0.24724602203182375,\n\
\ \"mc1_stderr\": 0.015102404797359652,\n \"mc2\": 0.3749705897753555,\n\
\ \"mc2_stderr\": 0.014205445237088549\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4308873720136519,\n \"acc_stderr\": 0.014471133392642475,\n\
\ \"acc_norm\": 0.46928327645051193,\n \"acc_norm_stderr\": 0.014583792546304038\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5372435769766979,\n\
\ \"acc_stderr\": 0.004975919665116535,\n \"acc_norm\": 0.7277434773949413,\n\
\ \"acc_norm_stderr\": 0.004442115268580939\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34814814814814815,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.34814814814814815,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3223684210526316,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.3223684210526316,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.25660377358490566,\n \"acc_stderr\": 0.026880647889051975,\n\
\ \"acc_norm\": 0.25660377358490566,\n \"acc_norm_stderr\": 0.026880647889051975\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n\
\ \"acc_stderr\": 0.03716177437566016,\n \"acc_norm\": 0.2708333333333333,\n\
\ \"acc_norm_stderr\": 0.03716177437566016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n\
\ \"acc_stderr\": 0.031265112061730424,\n \"acc_norm\": 0.2138728323699422,\n\
\ \"acc_norm_stderr\": 0.031265112061730424\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.24680851063829787,\n \"acc_stderr\": 0.0281854413012341,\n\
\ \"acc_norm\": 0.24680851063829787,\n \"acc_norm_stderr\": 0.0281854413012341\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.036951833116502325,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.036951833116502325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2804232804232804,\n \"acc_stderr\": 0.023135287974325645,\n \"\
acc_norm\": 0.2804232804232804,\n \"acc_norm_stderr\": 0.023135287974325645\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.040735243221471276,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.040735243221471276\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421255,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421255\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.20967741935483872,\n \"acc_stderr\": 0.023157879349083536,\n \"\
acc_norm\": 0.20967741935483872,\n \"acc_norm_stderr\": 0.023157879349083536\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.18226600985221675,\n \"acc_stderr\": 0.02716334085964515,\n \"\
acc_norm\": 0.18226600985221675,\n \"acc_norm_stderr\": 0.02716334085964515\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139404,\n\
\ \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139404\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.23232323232323232,\n \"acc_stderr\": 0.03008862949021749,\n \"\
acc_norm\": 0.23232323232323232,\n \"acc_norm_stderr\": 0.03008862949021749\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.26424870466321243,\n \"acc_stderr\": 0.03182155050916647,\n\
\ \"acc_norm\": 0.26424870466321243,\n \"acc_norm_stderr\": 0.03182155050916647\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20512820512820512,\n \"acc_stderr\": 0.02047323317355198,\n\
\ \"acc_norm\": 0.20512820512820512,\n \"acc_norm_stderr\": 0.02047323317355198\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.02708037281514565,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.02708037281514565\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.02665353159671549,\n\
\ \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.02665353159671549\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008937,\n \"\
acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008937\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21651376146788992,\n \"acc_stderr\": 0.01765871059444313,\n \"\
acc_norm\": 0.21651376146788992,\n \"acc_norm_stderr\": 0.01765871059444313\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1574074074074074,\n \"acc_stderr\": 0.024837173518242384,\n \"\
acc_norm\": 0.1574074074074074,\n \"acc_norm_stderr\": 0.024837173518242384\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.27941176470588236,\n \"acc_stderr\": 0.031493281045079556,\n \"\
acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.031493281045079556\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.27848101265822783,\n \"acc_stderr\": 0.029178682304842544,\n \
\ \"acc_norm\": 0.27848101265822783,\n \"acc_norm_stderr\": 0.029178682304842544\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.037276735755969174,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.037276735755969174\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2975206611570248,\n \"acc_stderr\": 0.04173349148083497,\n \"\
acc_norm\": 0.2975206611570248,\n \"acc_norm_stderr\": 0.04173349148083497\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.32407407407407407,\n\
\ \"acc_stderr\": 0.04524596007030048,\n \"acc_norm\": 0.32407407407407407,\n\
\ \"acc_norm_stderr\": 0.04524596007030048\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3067484662576687,\n \"acc_stderr\": 0.03623089915724148,\n\
\ \"acc_norm\": 0.3067484662576687,\n \"acc_norm_stderr\": 0.03623089915724148\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.20535714285714285,\n\
\ \"acc_stderr\": 0.03834241021419073,\n \"acc_norm\": 0.20535714285714285,\n\
\ \"acc_norm_stderr\": 0.03834241021419073\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.23300970873786409,\n \"acc_stderr\": 0.041858325989283136,\n\
\ \"acc_norm\": 0.23300970873786409,\n \"acc_norm_stderr\": 0.041858325989283136\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2606837606837607,\n\
\ \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.2606837606837607,\n\
\ \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2796934865900383,\n\
\ \"acc_stderr\": 0.01605079214803653,\n \"acc_norm\": 0.2796934865900383,\n\
\ \"acc_norm_stderr\": 0.01605079214803653\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.30057803468208094,\n \"acc_stderr\": 0.024685316867257803,\n\
\ \"acc_norm\": 0.30057803468208094,\n \"acc_norm_stderr\": 0.024685316867257803\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808836,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808836\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.28104575163398693,\n \"acc_stderr\": 0.02573885479781873,\n\
\ \"acc_norm\": 0.28104575163398693,\n \"acc_norm_stderr\": 0.02573885479781873\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.31189710610932475,\n\
\ \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.31189710610932475,\n\
\ \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.02465968518596727,\n\
\ \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.02465968518596727\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180848,\n \
\ \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180848\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2438070404172099,\n\
\ \"acc_stderr\": 0.010966507972178475,\n \"acc_norm\": 0.2438070404172099,\n\
\ \"acc_norm_stderr\": 0.010966507972178475\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.024398192986654924,\n\
\ \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.024398192986654924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26633986928104575,\n \"acc_stderr\": 0.01788318813466719,\n \
\ \"acc_norm\": 0.26633986928104575,\n \"acc_norm_stderr\": 0.01788318813466719\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.040139645540727735,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.040139645540727735\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.22040816326530613,\n \"acc_stderr\": 0.026537045312145277,\n\
\ \"acc_norm\": 0.22040816326530613,\n \"acc_norm_stderr\": 0.026537045312145277\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.263681592039801,\n\
\ \"acc_stderr\": 0.03115715086935555,\n \"acc_norm\": 0.263681592039801,\n\
\ \"acc_norm_stderr\": 0.03115715086935555\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25903614457831325,\n\
\ \"acc_stderr\": 0.03410646614071856,\n \"acc_norm\": 0.25903614457831325,\n\
\ \"acc_norm_stderr\": 0.03410646614071856\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.34502923976608185,\n \"acc_stderr\": 0.036459813773888065,\n\
\ \"acc_norm\": 0.34502923976608185,\n \"acc_norm_stderr\": 0.036459813773888065\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24724602203182375,\n\
\ \"mc1_stderr\": 0.015102404797359652,\n \"mc2\": 0.3749705897753555,\n\
\ \"mc2_stderr\": 0.014205445237088549\n }\n}\n```"
repo_url: https://huggingface.co/h2oai/h2ogpt-oasst1-512-20b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|arc:challenge|25_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hellaswag|10_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T21:43:07.012781.parquet'
- config_name: results
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- results_2023-07-19T21:43:07.012781.parquet
- split: latest
path:
- results_2023-07-19T21:43:07.012781.parquet
---
# Dataset Card for Evaluation run of h2oai/h2ogpt-oasst1-512-20b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/h2oai/h2ogpt-oasst1-512-20b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [h2oai/h2ogpt-oasst1-512-20b](https://huggingface.co/h2oai/h2ogpt-oasst1-512-20b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
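As the `configs` section of this card shows, each per-run split name is simply the run timestamp with hyphens and colons replaced by underscores. A minimal sketch of that mapping, inferred from this card (the helper name is illustrative, not part of any API):

```python
def timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp to the split name used in this card.

    Hyphens and colons become underscores; the fractional-seconds dot is kept
    (e.g. "2023-07-19T21:43:07.012781" -> "2023_07_19T21_43_07.012781").
    """
    return timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-07-19T21:43:07.012781"))
# → 2023_07_19T21_43_07.012781
```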
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_h2oai__h2ogpt-oasst1-512-20b",
"harness_truthfulqa_mc_0",
split="train")
```
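The repository name passed to `load_dataset` above follows a predictable convention: `open-llm-leaderboard/details_` plus the model id with `/` replaced by `__`. A small sketch of that convention, inferred from this card's own repo name (the helper name is illustrative):

```python
def details_repo(model_id: str) -> str:
    # Convention inferred from this card: "/" in the model id becomes "__",
    # prefixed with the leaderboard's "details_" namespace.
    return "open-llm-leaderboard/details_" + model_id.replace("/", "__")

print(details_repo("h2oai/h2ogpt-oasst1-512-20b"))
# → open-llm-leaderboard/details_h2oai__h2ogpt-oasst1-512-20b
```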
## Latest results
These are the [latest results from run 2023-07-19T21:43:07.012781](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-oasst1-512-20b/blob/main/results_2023-07-19T21%3A43%3A07.012781.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.27000963699095726,
"acc_stderr": 0.03207462279029465,
"acc_norm": 0.2738892269037556,
"acc_norm_stderr": 0.032067484735330685,
"mc1": 0.24724602203182375,
"mc1_stderr": 0.015102404797359652,
"mc2": 0.3749705897753555,
"mc2_stderr": 0.014205445237088549
},
"harness|arc:challenge|25": {
"acc": 0.4308873720136519,
"acc_stderr": 0.014471133392642475,
"acc_norm": 0.46928327645051193,
"acc_norm_stderr": 0.014583792546304038
},
"harness|hellaswag|10": {
"acc": 0.5372435769766979,
"acc_stderr": 0.004975919665116535,
"acc_norm": 0.7277434773949413,
"acc_norm_stderr": 0.004442115268580939
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3223684210526316,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.3223684210526316,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.25660377358490566,
"acc_stderr": 0.026880647889051975,
"acc_norm": 0.25660377358490566,
"acc_norm_stderr": 0.026880647889051975
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.03716177437566016,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.03716177437566016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.031265112061730424,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.031265112061730424
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.24680851063829787,
"acc_stderr": 0.0281854413012341,
"acc_norm": 0.24680851063829787,
"acc_norm_stderr": 0.0281854413012341
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.036951833116502325,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.036951833116502325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2804232804232804,
"acc_stderr": 0.023135287974325645,
"acc_norm": 0.2804232804232804,
"acc_norm_stderr": 0.023135287974325645
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471276,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471276
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421255,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421255
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.20967741935483872,
"acc_stderr": 0.023157879349083536,
"acc_norm": 0.20967741935483872,
"acc_norm_stderr": 0.023157879349083536
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.18226600985221675,
"acc_stderr": 0.02716334085964515,
"acc_norm": 0.18226600985221675,
"acc_norm_stderr": 0.02716334085964515
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23232323232323232,
"acc_stderr": 0.03008862949021749,
"acc_norm": 0.23232323232323232,
"acc_norm_stderr": 0.03008862949021749
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.26424870466321243,
"acc_stderr": 0.03182155050916647,
"acc_norm": 0.26424870466321243,
"acc_norm_stderr": 0.03182155050916647
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20512820512820512,
"acc_stderr": 0.02047323317355198,
"acc_norm": 0.20512820512820512,
"acc_norm_stderr": 0.02047323317355198
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.02708037281514565,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.02708037281514565
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.02665353159671549,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.02665353159671549
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2251655629139073,
"acc_stderr": 0.03410435282008937,
"acc_norm": 0.2251655629139073,
"acc_norm_stderr": 0.03410435282008937
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21651376146788992,
"acc_stderr": 0.01765871059444313,
"acc_norm": 0.21651376146788992,
"acc_norm_stderr": 0.01765871059444313
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1574074074074074,
"acc_stderr": 0.024837173518242384,
"acc_norm": 0.1574074074074074,
"acc_norm_stderr": 0.024837173518242384
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.031493281045079556,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.031493281045079556
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.27848101265822783,
"acc_stderr": 0.029178682304842544,
"acc_norm": 0.27848101265822783,
"acc_norm_stderr": 0.029178682304842544
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.037276735755969174,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.037276735755969174
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2975206611570248,
"acc_stderr": 0.04173349148083497,
"acc_norm": 0.2975206611570248,
"acc_norm_stderr": 0.04173349148083497
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.04524596007030048,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.04524596007030048
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3067484662576687,
"acc_stderr": 0.03623089915724148,
"acc_norm": 0.3067484662576687,
"acc_norm_stderr": 0.03623089915724148
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.20535714285714285,
"acc_stderr": 0.03834241021419073,
"acc_norm": 0.20535714285714285,
"acc_norm_stderr": 0.03834241021419073
},
"harness|hendrycksTest-management|5": {
"acc": 0.23300970873786409,
"acc_stderr": 0.041858325989283136,
"acc_norm": 0.23300970873786409,
"acc_norm_stderr": 0.041858325989283136
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2606837606837607,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.2606837606837607,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2796934865900383,
"acc_stderr": 0.01605079214803653,
"acc_norm": 0.2796934865900383,
"acc_norm_stderr": 0.01605079214803653
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.30057803468208094,
"acc_stderr": 0.024685316867257803,
"acc_norm": 0.30057803468208094,
"acc_norm_stderr": 0.024685316867257803
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808836,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808836
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.28104575163398693,
"acc_stderr": 0.02573885479781873,
"acc_norm": 0.28104575163398693,
"acc_norm_stderr": 0.02573885479781873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.31189710610932475,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.31189710610932475,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.02465968518596727,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.02465968518596727
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180848,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2438070404172099,
"acc_stderr": 0.010966507972178475,
"acc_norm": 0.2438070404172099,
"acc_norm_stderr": 0.010966507972178475
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.024398192986654924,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.024398192986654924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26633986928104575,
"acc_stderr": 0.01788318813466719,
"acc_norm": 0.26633986928104575,
"acc_norm_stderr": 0.01788318813466719
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.040139645540727735,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.040139645540727735
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22040816326530613,
"acc_stderr": 0.026537045312145277,
"acc_norm": 0.22040816326530613,
"acc_norm_stderr": 0.026537045312145277
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.263681592039801,
"acc_stderr": 0.03115715086935555,
"acc_norm": 0.263681592039801,
"acc_norm_stderr": 0.03115715086935555
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25903614457831325,
"acc_stderr": 0.03410646614071856,
"acc_norm": 0.25903614457831325,
"acc_norm_stderr": 0.03410646614071856
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.34502923976608185,
"acc_stderr": 0.036459813773888065,
"acc_norm": 0.34502923976608185,
"acc_norm_stderr": 0.036459813773888065
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24724602203182375,
"mc1_stderr": 0.015102404797359652,
"mc2": 0.3749705897753555,
"mc2_stderr": 0.014205445237088549
}
}
```
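All per-subtask MMLU scores above live under keys of the form `harness|hendrycksTest-<subject>|5`, so an aggregate accuracy can be computed by averaging them. A minimal sketch over a truncated sample of the dict above (only the first three subtasks are copied in; the full average over all subtasks would differ):

```python
# Truncated sample of the results dict shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.27},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.34814814814814815},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.3223684210526316},
}

# Average the accuracy of every hendrycksTest (MMLU) subtask present.
mmlu_scores = [
    v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(round(mmlu_avg, 4))
# → 0.3135
```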
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
---
pretty_name: Evaluation run of h2oai/h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [h2oai/h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2](https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 runs. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T01:16:14.347906](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2/blob/main/results_2023-09-23T01-16-14.347906.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0964765100671141,\n\
\ \"em_stderr\": 0.0030235709755854464,\n \"f1\": 0.15010381711409398,\n\
\ \"f1_stderr\": 0.0032252432502273593,\n \"acc\": 0.32434164506008656,\n\
\ \"acc_stderr\": 0.007374349538733694\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0964765100671141,\n \"em_stderr\": 0.0030235709755854464,\n\
\ \"f1\": 0.15010381711409398,\n \"f1_stderr\": 0.0032252432502273593\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \
\ \"acc_stderr\": 0.0013121578148674337\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6464088397790055,\n \"acc_stderr\": 0.013436541262599954\n\
\ }\n}\n```"
repo_url: https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T01_16_14.347906
path:
- '**/details_harness|drop|3_2023-09-23T01-16-14.347906.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T01-16-14.347906.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T01_16_14.347906
path:
- '**/details_harness|gsm8k|5_2023-09-23T01-16-14.347906.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T01-16-14.347906.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T01_16_14.347906
path:
- '**/details_harness|winogrande|5_2023-09-23T01-16-14.347906.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T01-16-14.347906.parquet'
- config_name: results
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- results_2023-07-19T17:24:55.002122.parquet
- split: 2023_09_23T01_16_14.347906
path:
- results_2023-09-23T01-16-14.347906.parquet
- split: latest
path:
- results_2023-09-23T01-16-14.347906.parquet
---
# Dataset Card for Evaluation run of h2oai/h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [h2oai/h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2](https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2",
"harness_winogrande_5",
split="train")
```
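As noted above, each timestamped split is named after its run timestamp. The mapping is mechanical: hyphens in the date and colons in the time become underscores. A small helper (hypothetical, not part of the `datasets` API) makes the convention explicit:

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp such as "2023-09-23T01:16:14.347906" to the
    corresponding split name, e.g. "2023_09_23T01_16_14.347906"."""
    # Hyphens (date separators) and colons (time separators) are both
    # replaced with underscores; the "T" and the fractional seconds stay.
    return ts.replace("-", "_").replace(":", "_")

# The latest run recorded in this card:
print(timestamp_to_split("2023-09-23T01:16:14.347906"))
# 2023_09_23T01_16_14.347906
```

Passing the resulting name as `split=` to `load_dataset` selects that specific run, while `split="latest"` always resolves to the most recent one.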
## Latest results
These are the [latest results from run 2023-09-23T01:16:14.347906](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2/blob/main/results_2023-09-23T01-16-14.347906.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0964765100671141,
"em_stderr": 0.0030235709755854464,
"f1": 0.15010381711409398,
"f1_stderr": 0.0032252432502273593,
"acc": 0.32434164506008656,
"acc_stderr": 0.007374349538733694
},
"harness|drop|3": {
"em": 0.0964765100671141,
"em_stderr": 0.0030235709755854464,
"f1": 0.15010381711409398,
"f1_stderr": 0.0032252432502273593
},
"harness|gsm8k|5": {
"acc": 0.002274450341167551,
"acc_stderr": 0.0013121578148674337
},
"harness|winogrande|5": {
"acc": 0.6464088397790055,
"acc_stderr": 0.013436541262599954
}
}
```
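The aggregated metrics above can be post-processed directly once loaded. For example, extracting per-task accuracy (a sketch over a subset of the JSON shown here, not a live download from the "results" configuration):

```python
import json

# A subset of the aggregated results shown above, as they appear in the
# "results" configuration of this dataset.
raw = """
{
  "harness|gsm8k|5": {"acc": 0.002274450341167551, "acc_stderr": 0.0013121578148674337},
  "harness|winogrande|5": {"acc": 0.6464088397790055, "acc_stderr": 0.013436541262599954}
}
"""

results = json.loads(raw)

# Keep only tasks that report an "acc" metric (the drop task, for
# instance, reports em/f1 instead and would be skipped).
accs = {task: metrics["acc"] for task, metrics in results.items() if "acc" in metrics}
print(accs)
```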
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]

open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-multilang-1024-20b | 2023-08-27T12:38:03.000Z | ["region:us"] | open-llm-leaderboard | null | null | null | 0 | 0

---
pretty_name: Evaluation run of h2oai/h2ogpt-gm-oasst1-multilang-1024-20b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [h2oai/h2ogpt-gm-oasst1-multilang-1024-20b](https://huggingface.co/h2oai/h2ogpt-gm-oasst1-multilang-1024-20b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-multilang-1024-20b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-19T21:26:27.370097](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-multilang-1024-20b/blob/main/results_2023-07-19T21%3A26%3A27.370097.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each one in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.27118463499837264,\n\
\ \"acc_stderr\": 0.03214355436593283,\n \"acc_norm\": 0.2750548266560701,\n\
\ \"acc_norm_stderr\": 0.03213630798718139,\n \"mc1\": 0.21664626682986537,\n\
\ \"mc1_stderr\": 0.014421468452506985,\n \"mc2\": 0.34391318826475564,\n\
\ \"mc2_stderr\": 0.013947824945193848\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.43686006825938567,\n \"acc_stderr\": 0.014494421584256527,\n\
\ \"acc_norm\": 0.47440273037542663,\n \"acc_norm_stderr\": 0.014592230885298959\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5349531965743876,\n\
\ \"acc_stderr\": 0.004977574188421318,\n \"acc_norm\": 0.7257518422624976,\n\
\ \"acc_norm_stderr\": 0.004452228541043549\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.26666666666666666,\n\
\ \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.26666666666666666,\n\
\ \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3092105263157895,\n \"acc_stderr\": 0.03761070869867479,\n\
\ \"acc_norm\": 0.3092105263157895,\n \"acc_norm_stderr\": 0.03761070869867479\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.04020151261036847,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.04020151261036847\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.24528301886792453,\n \"acc_stderr\": 0.02648035717989569,\n\
\ \"acc_norm\": 0.24528301886792453,\n \"acc_norm_stderr\": 0.02648035717989569\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749912,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749912\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \"acc_norm\": 0.35,\n\
\ \"acc_norm_stderr\": 0.04793724854411021\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2425531914893617,\n \"acc_stderr\": 0.02802022627120022,\n\
\ \"acc_norm\": 0.2425531914893617,\n \"acc_norm_stderr\": 0.02802022627120022\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.0409698513984367,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.0409698513984367\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.30344827586206896,\n \"acc_stderr\": 0.038312260488503336,\n\
\ \"acc_norm\": 0.30344827586206896,\n \"acc_norm_stderr\": 0.038312260488503336\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525218,\n \"\
acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525218\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047182,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047182\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.25161290322580643,\n \"acc_stderr\": 0.02468597928623996,\n \"\
acc_norm\": 0.25161290322580643,\n \"acc_norm_stderr\": 0.02468597928623996\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.20689655172413793,\n \"acc_stderr\": 0.028501378167893946,\n \"\
acc_norm\": 0.20689655172413793,\n \"acc_norm_stderr\": 0.028501378167893946\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.03524390844511783,\n\
\ \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.03524390844511783\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21212121212121213,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.2538860103626943,\n \"acc_stderr\": 0.03141024780565319,\n\
\ \"acc_norm\": 0.2538860103626943,\n \"acc_norm_stderr\": 0.03141024780565319\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.23333333333333334,\n \"acc_stderr\": 0.021444547301560465,\n\
\ \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.021444547301560465\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275794,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275794\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868956,\n\
\ \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868956\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24954128440366974,\n \"acc_stderr\": 0.018553897629501624,\n \"\
acc_norm\": 0.24954128440366974,\n \"acc_norm_stderr\": 0.018553897629501624\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.16666666666666666,\n \"acc_stderr\": 0.02541642838876748,\n \"\
acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.02541642838876748\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.3137254901960784,\n \"acc_stderr\": 0.03256685484460387,\n \"\
acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.03256685484460387\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.29957805907172996,\n \"acc_stderr\": 0.029818024749753095,\n \
\ \"acc_norm\": 0.29957805907172996,\n \"acc_norm_stderr\": 0.029818024749753095\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2600896860986547,\n\
\ \"acc_stderr\": 0.029442495585857483,\n \"acc_norm\": 0.2600896860986547,\n\
\ \"acc_norm_stderr\": 0.029442495585857483\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3140495867768595,\n \"acc_stderr\": 0.04236964753041018,\n \"\
acc_norm\": 0.3140495867768595,\n \"acc_norm_stderr\": 0.04236964753041018\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26993865030674846,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.26993865030674846,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03894641120044792,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03894641120044792\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.27184466019417475,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.27184466019417475,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2606837606837607,\n\
\ \"acc_stderr\": 0.028760348956523418,\n \"acc_norm\": 0.2606837606837607,\n\
\ \"acc_norm_stderr\": 0.028760348956523418\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2669220945083014,\n\
\ \"acc_stderr\": 0.015818450894777566,\n \"acc_norm\": 0.2669220945083014,\n\
\ \"acc_norm_stderr\": 0.015818450894777566\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.30346820809248554,\n \"acc_stderr\": 0.024752411960917202,\n\
\ \"acc_norm\": 0.30346820809248554,\n \"acc_norm_stderr\": 0.024752411960917202\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24804469273743016,\n\
\ \"acc_stderr\": 0.014444157808261427,\n \"acc_norm\": 0.24804469273743016,\n\
\ \"acc_norm_stderr\": 0.014444157808261427\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.02582916327275748,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.02582916327275748\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.29260450160771706,\n\
\ \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.29260450160771706,\n\
\ \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.02517104191530968,\n\
\ \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.02517104191530968\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2730496453900709,\n \"acc_stderr\": 0.026577860943307854,\n \
\ \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.026577860943307854\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2588005215123859,\n\
\ \"acc_stderr\": 0.011186109046564613,\n \"acc_norm\": 0.2588005215123859,\n\
\ \"acc_norm_stderr\": 0.011186109046564613\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.026303648393696036,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.026303648393696036\n \
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\"\
: 0.2679738562091503,\n \"acc_stderr\": 0.017917974069594722,\n \"\
acc_norm\": 0.2679738562091503,\n \"acc_norm_stderr\": 0.017917974069594722\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n\
\ \"acc_stderr\": 0.04122066502878284,\n \"acc_norm\": 0.24545454545454545,\n\
\ \"acc_norm_stderr\": 0.04122066502878284\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.22857142857142856,\n \"acc_stderr\": 0.026882144922307748,\n\
\ \"acc_norm\": 0.22857142857142856,\n \"acc_norm_stderr\": 0.026882144922307748\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.263681592039801,\n\
\ \"acc_stderr\": 0.03115715086935554,\n \"acc_norm\": 0.263681592039801,\n\
\ \"acc_norm_stderr\": 0.03115715086935554\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.29518072289156627,\n\
\ \"acc_stderr\": 0.035509201856896294,\n \"acc_norm\": 0.29518072289156627,\n\
\ \"acc_norm_stderr\": 0.035509201856896294\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.34502923976608185,\n \"acc_stderr\": 0.036459813773888065,\n\
\ \"acc_norm\": 0.34502923976608185,\n \"acc_norm_stderr\": 0.036459813773888065\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21664626682986537,\n\
\ \"mc1_stderr\": 0.014421468452506985,\n \"mc2\": 0.34391318826475564,\n\
\ \"mc2_stderr\": 0.013947824945193848\n }\n}\n```"
repo_url: https://huggingface.co/h2oai/h2ogpt-gm-oasst1-multilang-1024-20b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|arc:challenge|25_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hellaswag|10_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T21:26:27.370097.parquet'
- config_name: results
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- results_2023-07-19T21:26:27.370097.parquet
- split: latest
path:
- results_2023-07-19T21:26:27.370097.parquet
---
# Dataset Card for Evaluation run of h2oai/h2ogpt-gm-oasst1-multilang-1024-20b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/h2oai/h2ogpt-gm-oasst1-multilang-1024-20b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [h2oai/h2ogpt-gm-oasst1-multilang-1024-20b](https://huggingface.co/h2oai/h2ogpt-gm-oasst1-multilang-1024-20b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-multilang-1024-20b",
"harness_truthfulqa_mc_0",
	split="latest")
```
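As the YAML configuration above shows, each run's split is named after the run timestamp, with `-` and `:` replaced by `_` (the fractional-second dot is kept). A minimal sketch of that naming convention (the helper name is ours, not part of the leaderboard tooling):

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp to the split name used in this dataset's configs."""
    # "2023-07-19T21:26:27.370097" -> "2023_07_19T21_26_27.370097"
    return timestamp.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2023-07-19T21:26:27.370097"))
# -> 2023_07_19T21_26_27.370097
```

So `split="2023_07_19T21_26_27.370097"` selects that specific run, while `split="latest"` tracks the newest one.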
## Latest results
These are the [latest results from run 2023-07-19T21:26:27.370097](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-multilang-1024-20b/blob/main/results_2023-07-19T21%3A26%3A27.370097.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.27118463499837264,
"acc_stderr": 0.03214355436593283,
"acc_norm": 0.2750548266560701,
"acc_norm_stderr": 0.03213630798718139,
"mc1": 0.21664626682986537,
"mc1_stderr": 0.014421468452506985,
"mc2": 0.34391318826475564,
"mc2_stderr": 0.013947824945193848
},
"harness|arc:challenge|25": {
"acc": 0.43686006825938567,
"acc_stderr": 0.014494421584256527,
"acc_norm": 0.47440273037542663,
"acc_norm_stderr": 0.014592230885298959
},
"harness|hellaswag|10": {
"acc": 0.5349531965743876,
"acc_stderr": 0.004977574188421318,
"acc_norm": 0.7257518422624976,
"acc_norm_stderr": 0.004452228541043549
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.038201699145179055,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.038201699145179055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3092105263157895,
"acc_stderr": 0.03761070869867479,
"acc_norm": 0.3092105263157895,
"acc_norm_stderr": 0.03761070869867479
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036847,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036847
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24528301886792453,
"acc_stderr": 0.02648035717989569,
"acc_norm": 0.24528301886792453,
"acc_norm_stderr": 0.02648035717989569
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749912,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749912
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2425531914893617,
"acc_stderr": 0.02802022627120022,
"acc_norm": 0.2425531914893617,
"acc_norm_stderr": 0.02802022627120022
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.0409698513984367,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.0409698513984367
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.30344827586206896,
"acc_stderr": 0.038312260488503336,
"acc_norm": 0.30344827586206896,
"acc_norm_stderr": 0.038312260488503336
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525218,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525218
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047182,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047182
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25161290322580643,
"acc_stderr": 0.02468597928623996,
"acc_norm": 0.25161290322580643,
"acc_norm_stderr": 0.02468597928623996
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.20689655172413793,
"acc_stderr": 0.028501378167893946,
"acc_norm": 0.20689655172413793,
"acc_norm_stderr": 0.028501378167893946
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.28484848484848485,
"acc_stderr": 0.03524390844511783,
"acc_norm": 0.28484848484848485,
"acc_norm_stderr": 0.03524390844511783
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2538860103626943,
"acc_stderr": 0.03141024780565319,
"acc_norm": 0.2538860103626943,
"acc_norm_stderr": 0.03141024780565319
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.021444547301560465,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.021444547301560465
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275794,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275794
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868956,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868956
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24954128440366974,
"acc_stderr": 0.018553897629501624,
"acc_norm": 0.24954128440366974,
"acc_norm_stderr": 0.018553897629501624
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.02541642838876748,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.02541642838876748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.03256685484460387,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.03256685484460387
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.29957805907172996,
"acc_stderr": 0.029818024749753095,
"acc_norm": 0.29957805907172996,
"acc_norm_stderr": 0.029818024749753095
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2600896860986547,
"acc_stderr": 0.029442495585857483,
"acc_norm": 0.2600896860986547,
"acc_norm_stderr": 0.029442495585857483
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3140495867768595,
"acc_stderr": 0.04236964753041018,
"acc_norm": 0.3140495867768595,
"acc_norm_stderr": 0.04236964753041018
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26993865030674846,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.26993865030674846,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03894641120044792,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03894641120044792
},
"harness|hendrycksTest-management|5": {
"acc": 0.27184466019417475,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.27184466019417475,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2606837606837607,
"acc_stderr": 0.028760348956523418,
"acc_norm": 0.2606837606837607,
"acc_norm_stderr": 0.028760348956523418
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2669220945083014,
"acc_stderr": 0.015818450894777566,
"acc_norm": 0.2669220945083014,
"acc_norm_stderr": 0.015818450894777566
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.30346820809248554,
"acc_stderr": 0.024752411960917202,
"acc_norm": 0.30346820809248554,
"acc_norm_stderr": 0.024752411960917202
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24804469273743016,
"acc_stderr": 0.014444157808261427,
"acc_norm": 0.24804469273743016,
"acc_norm_stderr": 0.014444157808261427
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.29260450160771706,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.29260450160771706,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.02517104191530968,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.02517104191530968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.026577860943307854,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.026577860943307854
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2588005215123859,
"acc_stderr": 0.011186109046564613,
"acc_norm": 0.2588005215123859,
"acc_norm_stderr": 0.011186109046564613
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.25,
"acc_stderr": 0.026303648393696036,
"acc_norm": 0.25,
"acc_norm_stderr": 0.026303648393696036
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2679738562091503,
"acc_stderr": 0.017917974069594722,
"acc_norm": 0.2679738562091503,
"acc_norm_stderr": 0.017917974069594722
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.04122066502878284,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.04122066502878284
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22857142857142856,
"acc_stderr": 0.026882144922307748,
"acc_norm": 0.22857142857142856,
"acc_norm_stderr": 0.026882144922307748
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.263681592039801,
"acc_stderr": 0.03115715086935554,
"acc_norm": 0.263681592039801,
"acc_norm_stderr": 0.03115715086935554
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-virology|5": {
"acc": 0.29518072289156627,
"acc_stderr": 0.035509201856896294,
"acc_norm": 0.29518072289156627,
"acc_norm_stderr": 0.035509201856896294
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.34502923976608185,
"acc_stderr": 0.036459813773888065,
"acc_norm": 0.34502923976608185,
"acc_norm_stderr": 0.036459813773888065
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21664626682986537,
"mc1_stderr": 0.014421468452506985,
"mc2": 0.34391318826475564,
"mc2_stderr": 0.013947824945193848
}
}
```
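The `"all"` entry above is the aggregate over every task; the same kind of macro-average can be recomputed by hand from the per-task entries. A small sketch on a two-task excerpt of the JSON above (values copied verbatim):

```python
# Two-task excerpt of the results JSON above (values copied verbatim).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.23},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.26666666666666666},
}

# Macro-average accuracy: unweighted mean over the selected tasks.
accs = [scores["acc"] for scores in results.values()]
mean_acc = sum(accs) / len(accs)
print(round(mean_acc, 6))  # -> 0.248333
```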
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
---
pretty_name: Evaluation run of AGI-inc/lora_moe_7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AGI-inc/lora_moe_7b](https://huggingface.co/AGI-inc/lora_moe_7b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AGI-inc__lora_moe_7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"latest\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-24T11:46:25.370436](https://huggingface.co/datasets/open-llm-leaderboard/details_AGI-inc__lora_moe_7b/blob/main/results_2023-07-24T11%3A46%3A25.370436.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each one in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3624349655819883,\n\
\ \"acc_stderr\": 0.03457932037185986,\n \"acc_norm\": 0.36641755034742307,\n\
\ \"acc_norm_stderr\": 0.03456622803809125,\n \"mc1\": 0.22031823745410037,\n\
\ \"mc1_stderr\": 0.014509045171487291,\n \"mc2\": 0.3433554241758255,\n\
\ \"mc2_stderr\": 0.01319092242364727\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.47696245733788395,\n \"acc_stderr\": 0.014595873205358267,\n\
\ \"acc_norm\": 0.5093856655290102,\n \"acc_norm_stderr\": 0.014608816322065\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5754829715196176,\n\
\ \"acc_stderr\": 0.004932593348813628,\n \"acc_norm\": 0.7780322644891456,\n\
\ \"acc_norm_stderr\": 0.004147202539759587\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3851851851851852,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.3851851851851852,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.34210526315789475,\n \"acc_stderr\": 0.03860731599316092,\n\
\ \"acc_norm\": 0.34210526315789475,\n \"acc_norm_stderr\": 0.03860731599316092\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.41,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3622641509433962,\n \"acc_stderr\": 0.0295822451283843,\n\
\ \"acc_norm\": 0.3622641509433962,\n \"acc_norm_stderr\": 0.0295822451283843\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3819444444444444,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.3819444444444444,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3236994219653179,\n\
\ \"acc_stderr\": 0.035676037996391685,\n \"acc_norm\": 0.3236994219653179,\n\
\ \"acc_norm_stderr\": 0.035676037996391685\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171451,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171451\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.37446808510638296,\n \"acc_stderr\": 0.03163910665367291,\n\
\ \"acc_norm\": 0.37446808510638296,\n \"acc_norm_stderr\": 0.03163910665367291\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n\
\ \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525214,\n \"\
acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525214\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.038932596106046734,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.038932596106046734\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.33548387096774196,\n\
\ \"acc_stderr\": 0.02686020644472435,\n \"acc_norm\": 0.33548387096774196,\n\
\ \"acc_norm_stderr\": 0.02686020644472435\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.031785297106427496,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.031785297106427496\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.43636363636363634,\n \"acc_stderr\": 0.03872592983524754,\n\
\ \"acc_norm\": 0.43636363636363634,\n \"acc_norm_stderr\": 0.03872592983524754\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.03358618145732522,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03358618145732522\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.45077720207253885,\n \"acc_stderr\": 0.03590910952235525,\n\
\ \"acc_norm\": 0.45077720207253885,\n \"acc_norm_stderr\": 0.03590910952235525\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.34102564102564104,\n \"acc_stderr\": 0.024035489676335065,\n\
\ \"acc_norm\": 0.34102564102564104,\n \"acc_norm_stderr\": 0.024035489676335065\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712173,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712173\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3277310924369748,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.3277310924369748,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.036030385453603854,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.036030385453603854\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.48440366972477067,\n \"acc_stderr\": 0.02142689153920805,\n \"\
acc_norm\": 0.48440366972477067,\n \"acc_norm_stderr\": 0.02142689153920805\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.30092592592592593,\n \"acc_stderr\": 0.03128039084329881,\n \"\
acc_norm\": 0.30092592592592593,\n \"acc_norm_stderr\": 0.03128039084329881\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.35784313725490197,\n \"acc_stderr\": 0.03364487286088299,\n \"\
acc_norm\": 0.35784313725490197,\n \"acc_norm_stderr\": 0.03364487286088299\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.43037974683544306,\n \"acc_stderr\": 0.03223017195937598,\n \
\ \"acc_norm\": 0.43037974683544306,\n \"acc_norm_stderr\": 0.03223017195937598\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3991031390134529,\n\
\ \"acc_stderr\": 0.032867453125679603,\n \"acc_norm\": 0.3991031390134529,\n\
\ \"acc_norm_stderr\": 0.032867453125679603\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3435114503816794,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.3435114503816794,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5206611570247934,\n \"acc_stderr\": 0.04560456086387235,\n \"\
acc_norm\": 0.5206611570247934,\n \"acc_norm_stderr\": 0.04560456086387235\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4166666666666667,\n\
\ \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.4166666666666667,\n\
\ \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4294478527607362,\n \"acc_stderr\": 0.038890666191127216,\n\
\ \"acc_norm\": 0.4294478527607362,\n \"acc_norm_stderr\": 0.038890666191127216\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.33980582524271846,\n \"acc_stderr\": 0.04689765937278133,\n\
\ \"acc_norm\": 0.33980582524271846,\n \"acc_norm_stderr\": 0.04689765937278133\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.47863247863247865,\n\
\ \"acc_stderr\": 0.03272616447634954,\n \"acc_norm\": 0.47863247863247865,\n\
\ \"acc_norm_stderr\": 0.03272616447634954\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4278416347381865,\n\
\ \"acc_stderr\": 0.01769278792780373,\n \"acc_norm\": 0.4278416347381865,\n\
\ \"acc_norm_stderr\": 0.01769278792780373\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3901734104046243,\n \"acc_stderr\": 0.026261677607806653,\n\
\ \"acc_norm\": 0.3901734104046243,\n \"acc_norm_stderr\": 0.026261677607806653\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3954248366013072,\n \"acc_stderr\": 0.027996723180631445,\n\
\ \"acc_norm\": 0.3954248366013072,\n \"acc_norm_stderr\": 0.027996723180631445\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3987138263665595,\n\
\ \"acc_stderr\": 0.0278093225857745,\n \"acc_norm\": 0.3987138263665595,\n\
\ \"acc_norm_stderr\": 0.0278093225857745\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3487654320987654,\n \"acc_stderr\": 0.02651759772446501,\n\
\ \"acc_norm\": 0.3487654320987654,\n \"acc_norm_stderr\": 0.02651759772446501\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2730496453900709,\n \"acc_stderr\": 0.026577860943307857,\n \
\ \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.026577860943307857\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.29595827900912647,\n\
\ \"acc_stderr\": 0.011658518525277054,\n \"acc_norm\": 0.29595827900912647,\n\
\ \"acc_norm_stderr\": 0.011658518525277054\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.030161911930767102,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.030161911930767102\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.35294117647058826,\n \"acc_stderr\": 0.01933314202079706,\n \
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.01933314202079706\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.41818181818181815,\n\
\ \"acc_stderr\": 0.0472457740573157,\n \"acc_norm\": 0.41818181818181815,\n\
\ \"acc_norm_stderr\": 0.0472457740573157\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.34285714285714286,\n \"acc_stderr\": 0.030387262919547728,\n\
\ \"acc_norm\": 0.34285714285714286,\n \"acc_norm_stderr\": 0.030387262919547728\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.472636815920398,\n\
\ \"acc_stderr\": 0.03530235517334682,\n \"acc_norm\": 0.472636815920398,\n\
\ \"acc_norm_stderr\": 0.03530235517334682\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3313253012048193,\n\
\ \"acc_stderr\": 0.03664314777288085,\n \"acc_norm\": 0.3313253012048193,\n\
\ \"acc_norm_stderr\": 0.03664314777288085\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4853801169590643,\n \"acc_stderr\": 0.038331852752130205,\n\
\ \"acc_norm\": 0.4853801169590643,\n \"acc_norm_stderr\": 0.038331852752130205\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22031823745410037,\n\
\ \"mc1_stderr\": 0.014509045171487291,\n \"mc2\": 0.3433554241758255,\n\
\ \"mc2_stderr\": 0.01319092242364727\n }\n}\n```"
repo_url: https://huggingface.co/togethercomputer/Pythia-Chat-Base-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|arc:challenge|25_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hellaswag|10_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T11:46:25.370436.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:46:25.370436.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T11:46:25.370436.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T11:46:25.370436.parquet'
- config_name: results
data_files:
- split: 2023_07_24T11_46_25.370436
path:
- results_2023-07-24T11:46:25.370436.parquet
- split: latest
path:
- results_2023-07-24T11:46:25.370436.parquet
---
# Dataset Card for Evaluation run of AGI-inc/lora_moe_7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/AGI-inc/lora_moe_7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [AGI-inc/lora_moe_7b](https://huggingface.co/AGI-inc/lora_moe_7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
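The split names are derived from the run timestamp; based on the names appearing in this card (e.g. `2023-07-24T11:46:25.370436` becoming split `2023_07_24T11_46_25.370436`), the mapping appears to simply replace `-` and `:` with `_`. A minimal sketch of that assumed convention:

```python
def timestamp_to_split_name(run_timestamp: str) -> str:
    """Convert an ISO-like run timestamp into the split name used in this dataset.

    Assumption: split names are the timestamp with "-" and ":" replaced by "_",
    as suggested by the config listings in this card.
    """
    return run_timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split_name("2023-07-24T11:46:25.370436"))
# -> 2023_07_24T11_46_25.370436
```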
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AGI-inc__lora_moe_7b",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-07-24T11:46:25.370436](https://huggingface.co/datasets/open-llm-leaderboard/details_AGI-inc__lora_moe_7b/blob/main/results_2023-07-24T11%3A46%3A25.370436.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results, with a "latest" split for each eval):
```python
{
"all": {
"acc": 0.3624349655819883,
"acc_stderr": 0.03457932037185986,
"acc_norm": 0.36641755034742307,
"acc_norm_stderr": 0.03456622803809125,
"mc1": 0.22031823745410037,
"mc1_stderr": 0.014509045171487291,
"mc2": 0.3433554241758255,
"mc2_stderr": 0.01319092242364727
},
"harness|arc:challenge|25": {
"acc": 0.47696245733788395,
"acc_stderr": 0.014595873205358267,
"acc_norm": 0.5093856655290102,
"acc_norm_stderr": 0.014608816322065
},
"harness|hellaswag|10": {
"acc": 0.5754829715196176,
"acc_stderr": 0.004932593348813628,
"acc_norm": 0.7780322644891456,
"acc_norm_stderr": 0.004147202539759587
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3622641509433962,
"acc_stderr": 0.0295822451283843,
"acc_norm": 0.3622641509433962,
"acc_norm_stderr": 0.0295822451283843
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3819444444444444,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.3819444444444444,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3236994219653179,
"acc_stderr": 0.035676037996391685,
"acc_norm": 0.3236994219653179,
"acc_norm_stderr": 0.035676037996391685
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171451,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171451
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.37446808510638296,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.37446808510638296,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525214,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525214
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.038932596106046734,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.038932596106046734
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.33548387096774196,
"acc_stderr": 0.02686020644472435,
"acc_norm": 0.33548387096774196,
"acc_norm_stderr": 0.02686020644472435
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.031785297106427496,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.031785297106427496
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.43636363636363634,
"acc_stderr": 0.03872592983524754,
"acc_norm": 0.43636363636363634,
"acc_norm_stderr": 0.03872592983524754
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03358618145732522,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03358618145732522
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.45077720207253885,
"acc_stderr": 0.03590910952235525,
"acc_norm": 0.45077720207253885,
"acc_norm_stderr": 0.03590910952235525
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.34102564102564104,
"acc_stderr": 0.024035489676335065,
"acc_norm": 0.34102564102564104,
"acc_norm_stderr": 0.024035489676335065
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712173,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712173
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3277310924369748,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.3277310924369748,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.036030385453603854,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.036030385453603854
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.48440366972477067,
"acc_stderr": 0.02142689153920805,
"acc_norm": 0.48440366972477067,
"acc_norm_stderr": 0.02142689153920805
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.30092592592592593,
"acc_stderr": 0.03128039084329881,
"acc_norm": 0.30092592592592593,
"acc_norm_stderr": 0.03128039084329881
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.35784313725490197,
"acc_stderr": 0.03364487286088299,
"acc_norm": 0.35784313725490197,
"acc_norm_stderr": 0.03364487286088299
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.43037974683544306,
"acc_stderr": 0.03223017195937598,
"acc_norm": 0.43037974683544306,
"acc_norm_stderr": 0.03223017195937598
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3991031390134529,
"acc_stderr": 0.032867453125679603,
"acc_norm": 0.3991031390134529,
"acc_norm_stderr": 0.032867453125679603
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3435114503816794,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.3435114503816794,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5206611570247934,
"acc_stderr": 0.04560456086387235,
"acc_norm": 0.5206611570247934,
"acc_norm_stderr": 0.04560456086387235
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.04766075165356461,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.04766075165356461
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4294478527607362,
"acc_stderr": 0.038890666191127216,
"acc_norm": 0.4294478527607362,
"acc_norm_stderr": 0.038890666191127216
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.33980582524271846,
"acc_stderr": 0.04689765937278133,
"acc_norm": 0.33980582524271846,
"acc_norm_stderr": 0.04689765937278133
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.47863247863247865,
"acc_stderr": 0.03272616447634954,
"acc_norm": 0.47863247863247865,
"acc_norm_stderr": 0.03272616447634954
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4278416347381865,
"acc_stderr": 0.01769278792780373,
"acc_norm": 0.4278416347381865,
"acc_norm_stderr": 0.01769278792780373
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3901734104046243,
"acc_stderr": 0.026261677607806653,
"acc_norm": 0.3901734104046243,
"acc_norm_stderr": 0.026261677607806653
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3954248366013072,
"acc_stderr": 0.027996723180631445,
"acc_norm": 0.3954248366013072,
"acc_norm_stderr": 0.027996723180631445
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3987138263665595,
"acc_stderr": 0.0278093225857745,
"acc_norm": 0.3987138263665595,
"acc_norm_stderr": 0.0278093225857745
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3487654320987654,
"acc_stderr": 0.02651759772446501,
"acc_norm": 0.3487654320987654,
"acc_norm_stderr": 0.02651759772446501
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.026577860943307857,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.026577860943307857
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.29595827900912647,
"acc_stderr": 0.011658518525277054,
"acc_norm": 0.29595827900912647,
"acc_norm_stderr": 0.011658518525277054
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.030161911930767102,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.030161911930767102
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.01933314202079706,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.01933314202079706
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.41818181818181815,
"acc_stderr": 0.0472457740573157,
"acc_norm": 0.41818181818181815,
"acc_norm_stderr": 0.0472457740573157
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.34285714285714286,
"acc_stderr": 0.030387262919547728,
"acc_norm": 0.34285714285714286,
"acc_norm_stderr": 0.030387262919547728
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.472636815920398,
"acc_stderr": 0.03530235517334682,
"acc_norm": 0.472636815920398,
"acc_norm_stderr": 0.03530235517334682
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3313253012048193,
"acc_stderr": 0.03664314777288085,
"acc_norm": 0.3313253012048193,
"acc_norm_stderr": 0.03664314777288085
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4853801169590643,
"acc_stderr": 0.038331852752130205,
"acc_norm": 0.4853801169590643,
"acc_norm_stderr": 0.038331852752130205
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22031823745410037,
"mc1_stderr": 0.014509045171487291,
"mc2": 0.3433554241758255,
"mc2_stderr": 0.01319092242364727
}
}
```
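The per-task scores above can also be aggregated by hand. A minimal sketch, assuming the "all" accuracy is an unweighted mean of the per-task "acc" values (the exact aggregation used by the leaderboard is not specified in this card):

```python
def mean_acc(results: dict) -> float:
    """Unweighted mean of per-task 'acc' over all entries that report one."""
    accs = [v["acc"] for k, v in results.items() if k != "all" and "acc" in v]
    return sum(accs) / len(accs)

# Hypothetical miniature results dict in the same shape as the JSON above:
sample = {
    "all": {"acc": 0.35},
    "harness|arc:challenge|25": {"acc": 0.4, "acc_norm": 0.5},
    "harness|hellaswag|10": {"acc": 0.3, "acc_norm": 0.7},
}
print(mean_acc(sample))  # approximately 0.35
```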
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_AGI-inc__lora_moe_7b_baseline | 2023-08-27T12:38:06.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of AGI-inc/lora_moe_7b_baseline
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AGI-inc/lora_moe_7b_baseline](https://huggingface.co/AGI-inc/lora_moe_7b_baseline)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 61 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AGI-inc__lora_moe_7b_baseline\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-24T11:38:46.147581](https://huggingface.co/datasets/open-llm-leaderboard/details_AGI-inc__lora_moe_7b_baseline/blob/main/results_2023-07-24T11%3A38%3A46.147581.json)\
  \ (note that there might be results for other tasks in the repo if successive evals\
  \ didn't cover the same tasks. You can find each in the results, with a \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3624349655819883,\n\
\ \"acc_stderr\": 0.03457932037185986,\n \"acc_norm\": 0.36641755034742307,\n\
\ \"acc_norm_stderr\": 0.03456622803809125,\n \"mc1\": 0.22031823745410037,\n\
\ \"mc1_stderr\": 0.014509045171487291,\n \"mc2\": 0.3433554241758255,\n\
\ \"mc2_stderr\": 0.01319092242364727\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.47696245733788395,\n \"acc_stderr\": 0.014595873205358267,\n\
\ \"acc_norm\": 0.5093856655290102,\n \"acc_norm_stderr\": 0.014608816322065\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5754829715196176,\n\
\ \"acc_stderr\": 0.004932593348813628,\n \"acc_norm\": 0.7780322644891456,\n\
\ \"acc_norm_stderr\": 0.004147202539759587\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3851851851851852,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.3851851851851852,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.34210526315789475,\n \"acc_stderr\": 0.03860731599316092,\n\
\ \"acc_norm\": 0.34210526315789475,\n \"acc_norm_stderr\": 0.03860731599316092\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.41,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3622641509433962,\n \"acc_stderr\": 0.0295822451283843,\n\
\ \"acc_norm\": 0.3622641509433962,\n \"acc_norm_stderr\": 0.0295822451283843\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3819444444444444,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.3819444444444444,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3236994219653179,\n\
\ \"acc_stderr\": 0.035676037996391685,\n \"acc_norm\": 0.3236994219653179,\n\
\ \"acc_norm_stderr\": 0.035676037996391685\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171451,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171451\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.37446808510638296,\n \"acc_stderr\": 0.03163910665367291,\n\
\ \"acc_norm\": 0.37446808510638296,\n \"acc_norm_stderr\": 0.03163910665367291\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n\
\ \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525214,\n \"\
acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525214\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.038932596106046734,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.038932596106046734\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.33548387096774196,\n\
\ \"acc_stderr\": 0.02686020644472435,\n \"acc_norm\": 0.33548387096774196,\n\
\ \"acc_norm_stderr\": 0.02686020644472435\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.031785297106427496,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.031785297106427496\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.43636363636363634,\n \"acc_stderr\": 0.03872592983524754,\n\
\ \"acc_norm\": 0.43636363636363634,\n \"acc_norm_stderr\": 0.03872592983524754\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.03358618145732522,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03358618145732522\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.45077720207253885,\n \"acc_stderr\": 0.03590910952235525,\n\
\ \"acc_norm\": 0.45077720207253885,\n \"acc_norm_stderr\": 0.03590910952235525\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.34102564102564104,\n \"acc_stderr\": 0.024035489676335065,\n\
\ \"acc_norm\": 0.34102564102564104,\n \"acc_norm_stderr\": 0.024035489676335065\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712173,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712173\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3277310924369748,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.3277310924369748,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.036030385453603854,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.036030385453603854\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.48440366972477067,\n \"acc_stderr\": 0.02142689153920805,\n \"\
acc_norm\": 0.48440366972477067,\n \"acc_norm_stderr\": 0.02142689153920805\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.30092592592592593,\n \"acc_stderr\": 0.03128039084329881,\n \"\
acc_norm\": 0.30092592592592593,\n \"acc_norm_stderr\": 0.03128039084329881\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.35784313725490197,\n \"acc_stderr\": 0.03364487286088299,\n \"\
acc_norm\": 0.35784313725490197,\n \"acc_norm_stderr\": 0.03364487286088299\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.43037974683544306,\n \"acc_stderr\": 0.03223017195937598,\n \
\ \"acc_norm\": 0.43037974683544306,\n \"acc_norm_stderr\": 0.03223017195937598\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3991031390134529,\n\
\ \"acc_stderr\": 0.032867453125679603,\n \"acc_norm\": 0.3991031390134529,\n\
\ \"acc_norm_stderr\": 0.032867453125679603\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3435114503816794,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.3435114503816794,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5206611570247934,\n \"acc_stderr\": 0.04560456086387235,\n \"\
acc_norm\": 0.5206611570247934,\n \"acc_norm_stderr\": 0.04560456086387235\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4166666666666667,\n\
\ \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.4166666666666667,\n\
\ \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4294478527607362,\n \"acc_stderr\": 0.038890666191127216,\n\
\ \"acc_norm\": 0.4294478527607362,\n \"acc_norm_stderr\": 0.038890666191127216\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.33980582524271846,\n \"acc_stderr\": 0.04689765937278133,\n\
\ \"acc_norm\": 0.33980582524271846,\n \"acc_norm_stderr\": 0.04689765937278133\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.47863247863247865,\n\
\ \"acc_stderr\": 0.03272616447634954,\n \"acc_norm\": 0.47863247863247865,\n\
\ \"acc_norm_stderr\": 0.03272616447634954\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4278416347381865,\n\
\ \"acc_stderr\": 0.01769278792780373,\n \"acc_norm\": 0.4278416347381865,\n\
\ \"acc_norm_stderr\": 0.01769278792780373\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3901734104046243,\n \"acc_stderr\": 0.026261677607806653,\n\
\ \"acc_norm\": 0.3901734104046243,\n \"acc_norm_stderr\": 0.026261677607806653\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3954248366013072,\n \"acc_stderr\": 0.027996723180631445,\n\
\ \"acc_norm\": 0.3954248366013072,\n \"acc_norm_stderr\": 0.027996723180631445\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3987138263665595,\n\
\ \"acc_stderr\": 0.0278093225857745,\n \"acc_norm\": 0.3987138263665595,\n\
\ \"acc_norm_stderr\": 0.0278093225857745\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3487654320987654,\n \"acc_stderr\": 0.02651759772446501,\n\
\ \"acc_norm\": 0.3487654320987654,\n \"acc_norm_stderr\": 0.02651759772446501\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2730496453900709,\n \"acc_stderr\": 0.026577860943307857,\n \
\ \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.026577860943307857\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.29595827900912647,\n\
\ \"acc_stderr\": 0.011658518525277054,\n \"acc_norm\": 0.29595827900912647,\n\
\ \"acc_norm_stderr\": 0.011658518525277054\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.030161911930767102,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.030161911930767102\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.35294117647058826,\n \"acc_stderr\": 0.01933314202079706,\n \
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.01933314202079706\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.41818181818181815,\n\
\ \"acc_stderr\": 0.0472457740573157,\n \"acc_norm\": 0.41818181818181815,\n\
\ \"acc_norm_stderr\": 0.0472457740573157\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.34285714285714286,\n \"acc_stderr\": 0.030387262919547728,\n\
\ \"acc_norm\": 0.34285714285714286,\n \"acc_norm_stderr\": 0.030387262919547728\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.472636815920398,\n\
\ \"acc_stderr\": 0.03530235517334682,\n \"acc_norm\": 0.472636815920398,\n\
\ \"acc_norm_stderr\": 0.03530235517334682\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3313253012048193,\n\
\ \"acc_stderr\": 0.03664314777288085,\n \"acc_norm\": 0.3313253012048193,\n\
\ \"acc_norm_stderr\": 0.03664314777288085\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4853801169590643,\n \"acc_stderr\": 0.038331852752130205,\n\
\ \"acc_norm\": 0.4853801169590643,\n \"acc_norm_stderr\": 0.038331852752130205\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22031823745410037,\n\
\ \"mc1_stderr\": 0.014509045171487291,\n \"mc2\": 0.3433554241758255,\n\
\ \"mc2_stderr\": 0.01319092242364727\n }\n}\n```"
repo_url: https://huggingface.co/AGI-inc/lora_moe_7b_baseline
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|arc:challenge|25_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hellaswag|10_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T11:38:46.147581.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:38:46.147581.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T11:38:46.147581.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T11:38:46.147581.parquet'
- config_name: results
data_files:
- split: 2023_07_24T11_38_46.147581
path:
- results_2023-07-24T11:38:46.147581.parquet
- split: latest
path:
- results_2023-07-24T11:38:46.147581.parquet
---
# Dataset Card for Evaluation run of AGI-inc/lora_moe_7b_baseline
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/AGI-inc/lora_moe_7b_baseline
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [AGI-inc/lora_moe_7b_baseline](https://huggingface.co/AGI-inc/lora_moe_7b_baseline) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AGI-inc__lora_moe_7b_baseline",
"harness_truthfulqa_mc_0",
split="train")
```
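Once loaded, the "results" configuration holds one JSON payload per run with per-task metric dictionaries. The sketch below shows how such a payload can be consumed; a small local dict (with values copied from this card) stands in for the downloaded file, so no Hub access is assumed:

```python
# Minimal sketch: a local dict standing in for one run's "results" payload.
# The task keys and metric names mirror the JSON shown in this card.
results = {
    "all": {"acc": 0.3624349655819883, "mc2": 0.3433554241758255},
    "harness|arc:challenge|25": {"acc_norm": 0.5093856655290102},
    "harness|hellaswag|10": {"acc_norm": 0.7780322644891456},
}

# Collect a headline metric per task, preferring acc_norm and falling
# back to acc, then mc2, when a task does not report it.
headline = {
    task: metrics.get("acc_norm", metrics.get("acc", metrics.get("mc2")))
    for task, metrics in results.items()
    if task != "all"
}
print(headline)
```

The same loop applies unchanged to the full payload, which additionally contains one `harness|hendrycksTest-*|5` entry per MMLU subject.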
## Latest results
These are the [latest results from run 2023-07-24T11:38:46.147581](https://huggingface.co/datasets/open-llm-leaderboard/details_AGI-inc__lora_moe_7b_baseline/blob/main/results_2023-07-24T11%3A38%3A46.147581.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its timestamped splits, and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.3624349655819883,
"acc_stderr": 0.03457932037185986,
"acc_norm": 0.36641755034742307,
"acc_norm_stderr": 0.03456622803809125,
"mc1": 0.22031823745410037,
"mc1_stderr": 0.014509045171487291,
"mc2": 0.3433554241758255,
"mc2_stderr": 0.01319092242364727
},
"harness|arc:challenge|25": {
"acc": 0.47696245733788395,
"acc_stderr": 0.014595873205358267,
"acc_norm": 0.5093856655290102,
"acc_norm_stderr": 0.014608816322065
},
"harness|hellaswag|10": {
"acc": 0.5754829715196176,
"acc_stderr": 0.004932593348813628,
"acc_norm": 0.7780322644891456,
"acc_norm_stderr": 0.004147202539759587
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3622641509433962,
"acc_stderr": 0.0295822451283843,
"acc_norm": 0.3622641509433962,
"acc_norm_stderr": 0.0295822451283843
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3819444444444444,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.3819444444444444,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3236994219653179,
"acc_stderr": 0.035676037996391685,
"acc_norm": 0.3236994219653179,
"acc_norm_stderr": 0.035676037996391685
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171451,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171451
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.37446808510638296,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.37446808510638296,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525214,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525214
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.038932596106046734,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.038932596106046734
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.33548387096774196,
"acc_stderr": 0.02686020644472435,
"acc_norm": 0.33548387096774196,
"acc_norm_stderr": 0.02686020644472435
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.031785297106427496,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.031785297106427496
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.43636363636363634,
"acc_stderr": 0.03872592983524754,
"acc_norm": 0.43636363636363634,
"acc_norm_stderr": 0.03872592983524754
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03358618145732522,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03358618145732522
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.45077720207253885,
"acc_stderr": 0.03590910952235525,
"acc_norm": 0.45077720207253885,
"acc_norm_stderr": 0.03590910952235525
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.34102564102564104,
"acc_stderr": 0.024035489676335065,
"acc_norm": 0.34102564102564104,
"acc_norm_stderr": 0.024035489676335065
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712173,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712173
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3277310924369748,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.3277310924369748,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.036030385453603854,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.036030385453603854
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.48440366972477067,
"acc_stderr": 0.02142689153920805,
"acc_norm": 0.48440366972477067,
"acc_norm_stderr": 0.02142689153920805
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.30092592592592593,
"acc_stderr": 0.03128039084329881,
"acc_norm": 0.30092592592592593,
"acc_norm_stderr": 0.03128039084329881
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.35784313725490197,
"acc_stderr": 0.03364487286088299,
"acc_norm": 0.35784313725490197,
"acc_norm_stderr": 0.03364487286088299
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.43037974683544306,
"acc_stderr": 0.03223017195937598,
"acc_norm": 0.43037974683544306,
"acc_norm_stderr": 0.03223017195937598
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3991031390134529,
"acc_stderr": 0.032867453125679603,
"acc_norm": 0.3991031390134529,
"acc_norm_stderr": 0.032867453125679603
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3435114503816794,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.3435114503816794,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5206611570247934,
"acc_stderr": 0.04560456086387235,
"acc_norm": 0.5206611570247934,
"acc_norm_stderr": 0.04560456086387235
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.04766075165356461,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.04766075165356461
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4294478527607362,
"acc_stderr": 0.038890666191127216,
"acc_norm": 0.4294478527607362,
"acc_norm_stderr": 0.038890666191127216
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.33980582524271846,
"acc_stderr": 0.04689765937278133,
"acc_norm": 0.33980582524271846,
"acc_norm_stderr": 0.04689765937278133
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.47863247863247865,
"acc_stderr": 0.03272616447634954,
"acc_norm": 0.47863247863247865,
"acc_norm_stderr": 0.03272616447634954
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4278416347381865,
"acc_stderr": 0.01769278792780373,
"acc_norm": 0.4278416347381865,
"acc_norm_stderr": 0.01769278792780373
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3901734104046243,
"acc_stderr": 0.026261677607806653,
"acc_norm": 0.3901734104046243,
"acc_norm_stderr": 0.026261677607806653
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3954248366013072,
"acc_stderr": 0.027996723180631445,
"acc_norm": 0.3954248366013072,
"acc_norm_stderr": 0.027996723180631445
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3987138263665595,
"acc_stderr": 0.0278093225857745,
"acc_norm": 0.3987138263665595,
"acc_norm_stderr": 0.0278093225857745
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3487654320987654,
"acc_stderr": 0.02651759772446501,
"acc_norm": 0.3487654320987654,
"acc_norm_stderr": 0.02651759772446501
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.026577860943307857,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.026577860943307857
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.29595827900912647,
"acc_stderr": 0.011658518525277054,
"acc_norm": 0.29595827900912647,
"acc_norm_stderr": 0.011658518525277054
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.030161911930767102,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.030161911930767102
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.01933314202079706,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.01933314202079706
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.41818181818181815,
"acc_stderr": 0.0472457740573157,
"acc_norm": 0.41818181818181815,
"acc_norm_stderr": 0.0472457740573157
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.34285714285714286,
"acc_stderr": 0.030387262919547728,
"acc_norm": 0.34285714285714286,
"acc_norm_stderr": 0.030387262919547728
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.472636815920398,
"acc_stderr": 0.03530235517334682,
"acc_norm": 0.472636815920398,
"acc_norm_stderr": 0.03530235517334682
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3313253012048193,
"acc_stderr": 0.03664314777288085,
"acc_norm": 0.3313253012048193,
"acc_norm_stderr": 0.03664314777288085
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4853801169590643,
"acc_stderr": 0.038331852752130205,
"acc_norm": 0.4853801169590643,
"acc_norm_stderr": 0.038331852752130205
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22031823745410037,
"mc1_stderr": 0.014509045171487291,
"mc2": 0.3433554241758255,
"mc2_stderr": 0.01319092242364727
}
}
```
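The JSON above is a flat dict keyed by task name, with an `"all"` entry holding the aggregate metrics. A minimal sketch of pulling the headline numbers out of a results dict shaped like it (the values below are copied from this card's `"all"` block):

```python
# Results dict shaped like the JSON above; values copied from this card.
results = {
    "all": {
        "acc": 0.3624349655819883,
        "acc_stderr": 0.03457932037185986,
        "acc_norm": 0.36641755034742307,
        "acc_norm_stderr": 0.03456622803809125,
    },
    "harness|arc:challenge|25": {"acc_norm": 0.5093856655290102},
    "harness|hellaswag|10": {"acc_norm": 0.7780322644891456},
}

def summarize(results: dict) -> str:
    """Format the aggregate accuracy +/- its standard error."""
    agg = results["all"]
    return f"acc = {agg['acc']:.4f} ± {agg['acc_stderr']:.4f}"

print(summarize(results))  # acc = 0.3624 ± 0.0346
```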
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_64bits__LexPodLM-13B | 2023-09-17T03:54:41.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of 64bits/LexPodLM-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [64bits/LexPodLM-13B](https://huggingface.co/64bits/LexPodLM-13B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_64bits__LexPodLM-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T03:54:30.274513](https://huggingface.co/datasets/open-llm-leaderboard/details_64bits__LexPodLM-13B/blob/main/results_2023-09-17T03-54-30.274513.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.36052852348993286,\n\
\ \"em_stderr\": 0.004917224706996595,\n \"f1\": 0.3970910234899336,\n\
\ \"f1_stderr\": 0.004831848289565604,\n \"acc\": 0.3808208366219416,\n\
\ \"acc_stderr\": 0.005987474333851151\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.36052852348993286,\n \"em_stderr\": 0.004917224706996595,\n\
\ \"f1\": 0.3970910234899336,\n \"f1_stderr\": 0.004831848289565604\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7616416732438832,\n\
\ \"acc_stderr\": 0.011974948667702302\n }\n}\n```"
repo_url: https://huggingface.co/64bits/LexPodLM-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|arc:challenge|25_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T03_54_30.274513
path:
- '**/details_harness|drop|3_2023-09-17T03-54-30.274513.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T03-54-30.274513.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T03_54_30.274513
path:
- '**/details_harness|gsm8k|5_2023-09-17T03-54-30.274513.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T03-54-30.274513.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hellaswag|10_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-25T13:41:51.227672.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-25T13:41:51.227672.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-25T13:41:51.227672.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T03_54_30.274513
path:
- '**/details_harness|winogrande|5_2023-09-17T03-54-30.274513.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T03-54-30.274513.parquet'
- config_name: results
data_files:
- split: 2023_07_25T13_41_51.227672
path:
- results_2023-07-25T13:41:51.227672.parquet
- split: 2023_09_17T03_54_30.274513
path:
- results_2023-09-17T03-54-30.274513.parquet
- split: latest
path:
- results_2023-09-17T03-54-30.274513.parquet
---
# Dataset Card for Evaluation run of 64bits/LexPodLM-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/64bits/LexPodLM-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [64bits/LexPodLM-13B](https://huggingface.co/64bits/LexPodLM-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_64bits__LexPodLM-13B",
"harness_winogrande_5",
split="train")
```
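As the config listing above suggests, each run's split name appears to be the run timestamp with `-` and `:` replaced by `_` (the `.` in the fractional seconds is kept). A minimal sketch of that mapping — the helper name is ours, not part of the `datasets` API:

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to its split name, e.g.
    '2023-09-17T03:54:30.274513' -> '2023_09_17T03_54_30.274513'."""
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-09-17T03:54:30.274513"))
# -> 2023_09_17T03_54_30.274513
```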
## Latest results
These are the [latest results from run 2023-09-17T03:54:30.274513](https://huggingface.co/datasets/open-llm-leaderboard/details_64bits__LexPodLM-13B/blob/main/results_2023-09-17T03-54-30.274513.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.36052852348993286,
"em_stderr": 0.004917224706996595,
"f1": 0.3970910234899336,
"f1_stderr": 0.004831848289565604,
"acc": 0.3808208366219416,
"acc_stderr": 0.005987474333851151
},
"harness|drop|3": {
"em": 0.36052852348993286,
"em_stderr": 0.004917224706996595,
"f1": 0.3970910234899336,
"f1_stderr": 0.004831848289565604
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.7616416732438832,
"acc_stderr": 0.011974948667702302
}
}
```
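The JSON above is a nested dict keyed by task name, with each task mapping metric names to floats, so reading a single metric is plain dictionary indexing. A small sketch using the Winogrande entry (values copied from the results above):

```python
# Results dict shaped like the JSON above (values copied from it).
results = {
    "all": {"acc": 0.3808208366219416, "acc_stderr": 0.005987474333851151},
    "harness|winogrande|5": {
        "acc": 0.7616416732438832,
        "acc_stderr": 0.011974948667702302,
    },
}

# Each task entry maps metric names to floats.
acc = results["harness|winogrande|5"]["acc"]
err = results["harness|winogrande|5"]["acc_stderr"]
print(f"winogrande: {acc:.4f} +/- {err:.4f}")
```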
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_huggyllama__llama-65b | 2023-08-28T20:22:25.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of None
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [None](https://huggingface.co/None) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 119 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run.The \"train\" split is always pointing to the latest\
\ results.\n\nAn additional configuration \"results\" stores all the aggregated results\
\ of the run (and is used to compute and display the aggregated metrics on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_huggyllama__llama-65b\"\
,\n\t\"original_mmlu_world_religions_5\",\n\tsplit=\"train\")\n```\n\n## Latest\
\ results\n\nThese are the [latest results from run 2023-08-28T20:22:03.470786](https://huggingface.co/datasets/open-llm-leaderboard/details_huggyllama__llama-65b/blob/main/results_2023-08-28T20%3A22%3A03.470786.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6377292373869259,\n\
\ \"acc_stderr\": 0.033716462325154156\n },\n \"original|mmlu:abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316\n },\n\
\ \"original|mmlu:anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \
\ \"acc_stderr\": 0.04266763404099582\n },\n \"original|mmlu:astronomy|5\"\
: {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898\n\
\ },\n \"original|mmlu:business_ethics|5\": {\n \"acc\": 0.59,\n \
\ \"acc_stderr\": 0.04943110704237102\n },\n \"original|mmlu:clinical_knowledge|5\"\
: {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.029146904747798328\n\
\ },\n \"original|mmlu:college_biology|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.038760854559127644\n },\n \"original|mmlu:college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795\n },\n\
\ \"original|mmlu:college_computer_science|5\": {\n \"acc\": 0.46,\n \
\ \"acc_stderr\": 0.05009082659620332\n },\n \"original|mmlu:college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218\n },\n\
\ \"original|mmlu:college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498\n },\n \"original|mmlu:college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946\n\
\ },\n \"original|mmlu:computer_security|5\": {\n \"acc\": 0.8,\n \
\ \"acc_stderr\": 0.04020151261036846\n },\n \"original|mmlu:conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712\n\
\ },\n \"original|mmlu:econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.04598188057816541\n },\n \"original|mmlu:electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757\n\
\ },\n \"original|mmlu:elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n\
\ \"acc_stderr\": 0.02525303255499769\n },\n \"original|mmlu:formal_logic|5\"\
: {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466\n\
\ },\n \"original|mmlu:global_facts|5\": {\n \"acc\": 0.38,\n \
\ \"acc_stderr\": 0.048783173121456316\n },\n \"original|mmlu:high_school_biology|5\"\
: {\n \"acc\": 0.7419354838709677,\n \"acc_stderr\": 0.02489246917246283\n\
\ },\n \"original|mmlu:high_school_chemistry|5\": {\n \"acc\": 0.41379310344827586,\n\
\ \"acc_stderr\": 0.03465304488406795\n },\n \"original|mmlu:high_school_computer_science|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316\n },\n\
\ \"original|mmlu:high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n\
\ \"acc_stderr\": 0.03317505930009182\n },\n \"original|mmlu:high_school_geography|5\"\
: {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026703\n\
\ },\n \"original|mmlu:high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121444\n\
\ },\n \"original|mmlu:high_school_macroeconomics|5\": {\n \"acc\"\
: 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635467\n },\n \
\ \"original|mmlu:high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n\
\ \"acc_stderr\": 0.02897264888484427\n },\n \"original|mmlu:high_school_microeconomics|5\"\
: {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.03017680828897434\n\
\ },\n \"original|mmlu:high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n\
\ \"acc_stderr\": 0.03943966699183629\n },\n \"original|mmlu:high_school_psychology|5\"\
: {\n \"acc\": 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010347\n\
\ },\n \"original|mmlu:high_school_statistics|5\": {\n \"acc\": 0.6157407407407407,\n\
\ \"acc_stderr\": 0.03317354514310742\n },\n \"original|mmlu:high_school_us_history|5\"\
: {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.02646056956124065\n\
\ },\n \"original|mmlu:high_school_world_history|5\": {\n \"acc\":\
\ 0.8396624472573839,\n \"acc_stderr\": 0.02388438092596567\n },\n \
\ \"original|mmlu:human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \
\ \"acc_stderr\": 0.03160295143776679\n },\n \"original|mmlu:human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729\n\
\ },\n \"original|mmlu:international_law|5\": {\n \"acc\": 0.8181818181818182,\n\
\ \"acc_stderr\": 0.035208939510976534\n },\n \"original|mmlu:jurisprudence|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.042365112580946315\n\
\ },\n \"original|mmlu:logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n\
\ \"acc_stderr\": 0.03291099578615769\n },\n \"original|mmlu:machine_learning|5\"\
: {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546\n\
\ },\n \"original|mmlu:management|5\": {\n \"acc\": 0.8252427184466019,\n\
\ \"acc_stderr\": 0.03760178006026621\n },\n \"original|mmlu:marketing|5\"\
: {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333\n\
\ },\n \"original|mmlu:medical_genetics|5\": {\n \"acc\": 0.69,\n \
\ \"acc_stderr\": 0.04648231987117317\n },\n \"original|mmlu:miscellaneous|5\"\
: {\n \"acc\": 0.8135376756066411,\n \"acc_stderr\": 0.013927751372001506\n\
\ },\n \"original|mmlu:moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n\
\ \"acc_stderr\": 0.023786203255508297\n },\n \"original|mmlu:moral_scenarios|5\"\
: {\n \"acc\": 0.4748603351955307,\n \"acc_stderr\": 0.01670135084268263\n\
\ },\n \"original|mmlu:nutrition|5\": {\n \"acc\": 0.6895424836601307,\n\
\ \"acc_stderr\": 0.026493033225145894\n },\n \"original|mmlu:philosophy|5\"\
: {\n \"acc\": 0.7331189710610932,\n \"acc_stderr\": 0.025122637608816657\n\
\ },\n \"original|mmlu:prehistory|5\": {\n \"acc\": 0.7376543209876543,\n\
\ \"acc_stderr\": 0.024477222856135118\n },\n \"original|mmlu:professional_accounting|5\"\
: {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206\n\
\ },\n \"original|mmlu:professional_law|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.012770236105969923\n },\n \"original|mmlu:professional_medicine|5\"\
: {\n \"acc\": 0.6213235294117647,\n \"acc_stderr\": 0.02946513363977613\n\
\ },\n \"original|mmlu:professional_psychology|5\": {\n \"acc\": 0.6633986928104575,\n\
\ \"acc_stderr\": 0.019117213911495144\n },\n \"original|mmlu:public_relations|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.04172343038705383\n\
\ },\n \"original|mmlu:security_studies|5\": {\n \"acc\": 0.7224489795918367,\n\
\ \"acc_stderr\": 0.028666857790274655\n },\n \"original|mmlu:sociology|5\"\
: {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.02768691358801301\n\
\ },\n \"original|mmlu:us_foreign_policy|5\": {\n \"acc\": 0.88,\n\
\ \"acc_stderr\": 0.03265986323710906\n },\n \"original|mmlu:virology|5\"\
: {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767\n\
\ },\n \"original|mmlu:world_religions|5\": {\n \"acc\": 0.8128654970760234,\n\
\ \"acc_stderr\": 0.029913127232368043\n }\n}\n```"
repo_url: https://huggingface.co/None
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|arc:challenge|25_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hellaswag|10_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-21T02:59:30.993672.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-21T02:59:30.993672.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-21T02:59:30.993672.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-21T02:59:30.993672.parquet'
- config_name: original_mmlu_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T20:22:03.470786.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_abstract_algebra_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_anatomy_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_astronomy_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_business_ethics_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_clinical_knowledge_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_college_biology_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_college_chemistry_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_college_computer_science_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_college_mathematics_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_college_medicine_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_college_physics_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_computer_security_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_conceptual_physics_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_econometrics_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_electrical_engineering_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_elementary_mathematics_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_formal_logic_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_global_facts_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_high_school_biology_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_high_school_chemistry_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_high_school_computer_science_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_high_school_european_history_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_high_school_geography_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_high_school_mathematics_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_high_school_microeconomics_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_high_school_physics_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_high_school_psychology_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_high_school_statistics_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_high_school_us_history_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_high_school_world_history_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_human_aging_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_human_sexuality_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_international_law_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_jurisprudence_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_logical_fallacies_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_machine_learning_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_management_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:management|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:management|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_marketing_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_medical_genetics_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_miscellaneous_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_moral_disputes_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_moral_scenarios_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_nutrition_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_philosophy_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_prehistory_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_professional_accounting_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_professional_law_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_professional_medicine_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_professional_psychology_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_public_relations_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_security_studies_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_sociology_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_us_foreign_policy_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_virology_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:virology|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:virology|5_2023-08-28T20:22:03.470786.parquet'
- config_name: original_mmlu_world_religions_5
data_files:
- split: 2023_08_28T20_22_03.470786
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:22:03.470786.parquet'
- split: latest
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:22:03.470786.parquet'
- config_name: results
data_files:
- split: 2023_07_21T02_59_30.993672
path:
- results_2023-07-21T02:59:30.993672.parquet
- split: 2023_08_28T20_22_03.470786
path:
- results_2023-08-28T20:22:03.470786.parquet
- split: latest
path:
- results_2023-08-28T20:22:03.470786.parquet
---
# Dataset Card for Evaluation run of huggyllama/llama-65b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/huggyllama/llama-65b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [huggyllama/llama-65b](https://huggingface.co/huggyllama/llama-65b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 119 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_huggyllama__llama-65b",
"original_mmlu_world_religions_5",
split="train")
```
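The configuration name passed as the second argument follows directly from the task keys used in the results JSON: the `|` and `:` separators become underscores (e.g. `original|mmlu:world_religions|5` becomes `original_mmlu_world_religions_5`, matching the config names in this card's YAML header). A small, hypothetical helper sketching that mapping:

```python
def task_key_to_config_name(task_key: str) -> str:
    """Map a results task key such as 'original|mmlu:world_religions|5' to the
    corresponding dataset configuration name, e.g.
    'original_mmlu_world_religions_5'.

    Hypothetical helper for illustration; the mapping simply replaces the
    '|' and ':' separators with underscores.
    """
    return task_key.replace("|", "_").replace(":", "_")

# The returned string can be passed as the second argument to load_dataset(...).
print(task_key_to_config_name("original|mmlu:world_religions|5"))
```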
## Latest results
These are the [latest results from run 2023-08-28T20:22:03.470786](https://huggingface.co/datasets/open-llm-leaderboard/details_huggyllama__llama-65b/blob/main/results_2023-08-28T20%3A22%3A03.470786.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6377292373869259,
"acc_stderr": 0.033716462325154156
},
"original|mmlu:abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316
},
"original|mmlu:anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582
},
"original|mmlu:astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898
},
"original|mmlu:business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102
},
"original|mmlu:clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.029146904747798328
},
"original|mmlu:college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644
},
"original|mmlu:college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795
},
"original|mmlu:college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332
},
"original|mmlu:college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218
},
"original|mmlu:college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498
},
"original|mmlu:college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946
},
"original|mmlu:computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846
},
"original|mmlu:conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712
},
"original|mmlu:econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.04598188057816541
},
"original|mmlu:electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757
},
"original|mmlu:elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.02525303255499769
},
"original|mmlu:formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466
},
"original|mmlu:global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316
},
"original|mmlu:high_school_biology|5": {
"acc": 0.7419354838709677,
"acc_stderr": 0.02489246917246283
},
"original|mmlu:high_school_chemistry|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.03465304488406795
},
"original|mmlu:high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316
},
"original|mmlu:high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182
},
"original|mmlu:high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026703
},
"original|mmlu:high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121444
},
"original|mmlu:high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635467
},
"original|mmlu:high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427
},
"original|mmlu:high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.03017680828897434
},
"original|mmlu:high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629
},
"original|mmlu:high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010347
},
"original|mmlu:high_school_statistics|5": {
"acc": 0.6157407407407407,
"acc_stderr": 0.03317354514310742
},
"original|mmlu:high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.02646056956124065
},
"original|mmlu:high_school_world_history|5": {
"acc": 0.8396624472573839,
"acc_stderr": 0.02388438092596567
},
"original|mmlu:human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679
},
"original|mmlu:human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729
},
"original|mmlu:international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.035208939510976534
},
"original|mmlu:jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946315
},
"original|mmlu:logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769
},
"original|mmlu:machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546
},
"original|mmlu:management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621
},
"original|mmlu:marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333
},
"original|mmlu:medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117317
},
"original|mmlu:miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001506
},
"original|mmlu:moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508297
},
"original|mmlu:moral_scenarios|5": {
"acc": 0.4748603351955307,
"acc_stderr": 0.01670135084268263
},
"original|mmlu:nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145894
},
"original|mmlu:philosophy|5": {
"acc": 0.7331189710610932,
"acc_stderr": 0.025122637608816657
},
"original|mmlu:prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135118
},
"original|mmlu:professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206
},
"original|mmlu:professional_law|5": {
"acc": 0.5,
"acc_stderr": 0.012770236105969923
},
"original|mmlu:professional_medicine|5": {
"acc": 0.6213235294117647,
"acc_stderr": 0.02946513363977613
},
"original|mmlu:professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495144
},
"original|mmlu:public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.04172343038705383
},
"original|mmlu:security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274655
},
"original|mmlu:sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801301
},
"original|mmlu:us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906
},
"original|mmlu:virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767
},
"original|mmlu:world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368043
}
}
```
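The top-level `"all"` entry is an aggregate over the per-task scores, so it can be recomputed by averaging the task accuracies once the JSON has been loaded into a dict. A minimal sketch, using only a few of the task entries reproduced from above (the full file contains every MMLU subtask):

```python
# Minimal sketch: recompute a macro-average accuracy from a results dict shaped
# like the JSON above. Only a few task entries are reproduced here for brevity;
# the keys and values are copied from the results block.
results = {
    "all": {"acc": 0.6377292373869259, "acc_stderr": 0.033716462325154156},
    "original|mmlu:abstract_algebra|5": {"acc": 0.31},
    "original|mmlu:anatomy|5": {"acc": 0.5777777777777777},
    "original|mmlu:astronomy|5": {"acc": 0.7302631578947368},
}

# Skip the precomputed "all" aggregate and average the per-task accuracies.
task_accs = [v["acc"] for k, v in results.items() if k != "all"]
mean_acc = sum(task_accs) / len(task_accs)
print(f"mean acc over {len(task_accs)} tasks: {mean_acc:.4f}")
```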
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_huggyllama__llama-13b | 2023-09-23T10:41:55.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of huggyllama/llama-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [huggyllama/llama-13b](https://huggingface.co/huggyllama/llama-13b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 122 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run.The \"train\" split is always pointing to the latest\
\ results.\n\nAn additional configuration \"results\" stores all the aggregated results\
\ of the run (and is used to compute and display the aggregated metrics on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_huggyllama__llama-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T10:41:44.150256](https://huggingface.co/datasets/open-llm-leaderboard/details_huggyllama__llama-13b/blob/main/results_2023-09-23T10-41-44.150256.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0019924496644295304,\n\
\ \"em_stderr\": 0.000456667646266702,\n \"f1\": 0.056602348993288636,\n\
\ \"f1_stderr\": 0.0013004668300984712,\n \"acc\": 0.4191229752993855,\n\
\ \"acc_stderr\": 0.009626252314482865\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.000456667646266702,\n\
\ \"f1\": 0.056602348993288636,\n \"f1_stderr\": 0.0013004668300984712\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0758150113722517,\n \
\ \"acc_stderr\": 0.007291205723162579\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803152\n\
\ }\n}\n```"
repo_url: https://huggingface.co/huggyllama/llama-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|arc:challenge|25_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|arc:challenge|25_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T10_41_44.150256
path:
- '**/details_harness|drop|3_2023-09-23T10-41-44.150256.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T10-41-44.150256.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T10_41_44.150256
path:
- '**/details_harness|gsm8k|5_2023-09-23T10-41-44.150256.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T10-41-44.150256.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hellaswag|10_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hellaswag|10_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T10_41_44.150256
path:
- '**/details_harness|winogrande|5_2023-09-23T10-41-44.150256.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T10-41-44.150256.parquet'
- config_name: original_mmlu_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_abstract_algebra_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_anatomy_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_astronomy_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_business_ethics_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_clinical_knowledge_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_college_biology_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_college_chemistry_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_college_computer_science_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_college_mathematics_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_college_medicine_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_college_physics_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_computer_security_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_conceptual_physics_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_econometrics_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_electrical_engineering_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_elementary_mathematics_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_formal_logic_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_global_facts_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_high_school_biology_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_high_school_chemistry_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_high_school_computer_science_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_high_school_european_history_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_high_school_geography_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_high_school_mathematics_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_high_school_microeconomics_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_high_school_physics_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_high_school_psychology_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_high_school_statistics_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_high_school_us_history_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_high_school_world_history_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_human_aging_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_human_sexuality_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_international_law_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_jurisprudence_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_logical_fallacies_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_machine_learning_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_management_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:management|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:management|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_marketing_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_medical_genetics_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_miscellaneous_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_moral_disputes_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_moral_scenarios_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_nutrition_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_philosophy_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_prehistory_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_professional_accounting_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_professional_law_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_professional_medicine_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_professional_psychology_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_public_relations_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_security_studies_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_sociology_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_us_foreign_policy_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_virology_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:virology|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:virology|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_world_religions_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T19:54:33.085163.parquet'
- config_name: results
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- results_2023-07-24T15:13:44.970123.parquet
- split: 2023_08_19T22_15_08.436043
path:
- results_2023-08-19T22:15:08.436043.parquet
- split: 2023_08_28T19_54_33.085163
path:
- results_2023-08-28T19:54:33.085163.parquet
- split: 2023_09_23T10_41_44.150256
path:
- results_2023-09-23T10-41-44.150256.parquet
- split: latest
path:
- results_2023-09-23T10-41-44.150256.parquet
---
# Dataset Card for Evaluation run of huggyllama/llama-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/huggyllama/llama-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [huggyllama/llama-13b](https://huggingface.co/huggyllama/llama-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 122 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_huggyllama__llama-13b",
"harness_winogrande_5",
split="train")
```
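The run splits are named after zero-padded timestamps (`YYYY_MM_DDTHH_MM_SS.micros`), so lexicographic order matches chronological order and the newest run can be picked with a plain `max()`. A minimal sketch, using the split names listed in this card's `results` config (an illustration of the naming convention, not part of the leaderboard tooling):

```python
# Split names follow the zero-padded pattern YYYY_MM_DDTHH_MM_SS.micros,
# so sorting them as plain strings sorts them chronologically.
splits = [
    "2023_07_24T15_13_44.970123",
    "2023_08_19T22_15_08.436043",
    "2023_08_28T19_54_33.085163",
    "2023_09_23T10_41_44.150256",
]
latest = max(splits)  # the run that the "latest" split mirrors
print(latest)  # → 2023_09_23T10_41_44.150256
```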
## Latest results
These are the [latest results from run 2023-09-23T10:41:44.150256](https://huggingface.co/datasets/open-llm-leaderboard/details_huggyllama__llama-13b/blob/main/results_2023-09-23T10-41-44.150256.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0019924496644295304,
"em_stderr": 0.000456667646266702,
"f1": 0.056602348993288636,
"f1_stderr": 0.0013004668300984712,
"acc": 0.4191229752993855,
"acc_stderr": 0.009626252314482865
},
"harness|drop|3": {
"em": 0.0019924496644295304,
"em_stderr": 0.000456667646266702,
"f1": 0.056602348993288636,
"f1_stderr": 0.0013004668300984712
},
"harness|gsm8k|5": {
"acc": 0.0758150113722517,
"acc_stderr": 0.007291205723162579
},
"harness|winogrande|5": {
"acc": 0.7624309392265194,
"acc_stderr": 0.011961298905803152
}
}
```
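The `all` block appears to be the unweighted mean of the per-task metrics that share a name; a quick sketch checking `acc` against the two accuracy-scored tasks above (this is an observation about the numbers in this card, not documented leaderboard behaviour):

```python
# Per-task accuracies copied from the results above.
task_acc = {
    "harness|gsm8k|5": 0.0758150113722517,
    "harness|winogrande|5": 0.7624309392265194,
}
# The aggregate "acc" matches the plain mean of the per-task values.
mean_acc = sum(task_acc.values()) / len(task_acc)
assert abs(mean_acc - 0.4191229752993855) < 1e-12
print(mean_acc)
```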
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_pillowtalks-ai__delta13b | 2023-09-20T15:25:02.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of pillowtalks-ai/delta13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [pillowtalks-ai/delta13b](https://huggingface.co/pillowtalks-ai/delta13b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pillowtalks-ai__delta13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-20T15:24:50.493097](https://huggingface.co/datasets/open-llm-leaderboard/details_pillowtalks-ai__delta13b/blob/main/results_2023-09-20T15-24-50.493097.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.029677013422818792,\n\
\ \"em_stderr\": 0.0017378324714143493,\n \"f1\": 0.09310612416107406,\n\
\ \"f1_stderr\": 0.002167792401176146,\n \"acc\": 0.4141695683211732,\n\
\ \"acc_stderr\": 0.010019161585538096\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.029677013422818792,\n \"em_stderr\": 0.0017378324714143493,\n\
\ \"f1\": 0.09310612416107406,\n \"f1_stderr\": 0.002167792401176146\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08642911296436695,\n \
\ \"acc_stderr\": 0.00774004433710381\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7419100236779794,\n \"acc_stderr\": 0.012298278833972384\n\
\ }\n}\n```"
repo_url: https://huggingface.co/pillowtalks-ai/delta13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|arc:challenge|25_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_20T15_24_50.493097
path:
- '**/details_harness|drop|3_2023-09-20T15-24-50.493097.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-20T15-24-50.493097.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_20T15_24_50.493097
path:
- '**/details_harness|gsm8k|5_2023-09-20T15-24-50.493097.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-20T15-24-50.493097.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hellaswag|10_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T13:54:11.410236.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T13:54:11.410236.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T13:54:11.410236.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_20T15_24_50.493097
path:
- '**/details_harness|winogrande|5_2023-09-20T15-24-50.493097.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-20T15-24-50.493097.parquet'
- config_name: results
data_files:
- split: 2023_07_18T13_54_11.410236
path:
- results_2023-07-18T13:54:11.410236.parquet
- split: 2023_09_20T15_24_50.493097
path:
- results_2023-09-20T15-24-50.493097.parquet
- split: latest
path:
- results_2023-09-20T15-24-50.493097.parquet
---
# Dataset Card for Evaluation run of pillowtalks-ai/delta13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/pillowtalks-ai/delta13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [pillowtalks-ai/delta13b](https://huggingface.co/pillowtalks-ai/delta13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pillowtalks-ai__delta13b",
"harness_winogrande_5",
split="train")
```
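Split names are derived from the run timestamps by replacing the `-` and `:` characters with underscores (the fractional-second `.` is kept), so a run timestamp can be mapped to its split name programmatically. A minimal sketch — the helper name `timestamp_to_split` is ours, not part of any library:

```python
def timestamp_to_split(run_timestamp: str) -> str:
    """Convert a run timestamp such as '2023-09-20T15:24:50.493097'
    into the split name used in this dataset's configs.
    (Helper name is illustrative, not a library API.)"""
    return run_timestamp.replace("-", "_").replace(":", "_")

# e.g. the winogrande run listed above:
print(timestamp_to_split("2023-09-20T15:24:50.493097"))
# -> 2023_09_20T15_24_50.493097
```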
## Latest results
These are the [latest results from run 2023-09-20T15:24:50.493097](https://huggingface.co/datasets/open-llm-leaderboard/details_pillowtalks-ai__delta13b/blob/main/results_2023-09-20T15-24-50.493097.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.029677013422818792,
"em_stderr": 0.0017378324714143493,
"f1": 0.09310612416107406,
"f1_stderr": 0.002167792401176146,
"acc": 0.4141695683211732,
"acc_stderr": 0.010019161585538096
},
"harness|drop|3": {
"em": 0.029677013422818792,
"em_stderr": 0.0017378324714143493,
"f1": 0.09310612416107406,
"f1_stderr": 0.002167792401176146
},
"harness|gsm8k|5": {
"acc": 0.08642911296436695,
"acc_stderr": 0.00774004433710381
},
"harness|winogrande|5": {
"acc": 0.7419100236779794,
"acc_stderr": 0.012298278833972384
}
}
```
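The "all" block aggregates the per-task metrics: for each metric it appears to be the plain mean over the tasks that report it — e.g. the aggregate `acc` above equals the mean of the gsm8k and winogrande accuracies (an observation inferred from the numbers shown, not an official aggregation spec):

```python
# Per-task accuracies copied from the results JSON above.
task_acc = {
    "harness|gsm8k|5": 0.08642911296436695,
    "harness|winogrande|5": 0.7419100236779794,
}

# Plain mean over the accuracy-reporting tasks.
all_acc = sum(task_acc.values()) / len(task_acc)
print(all_acc)  # ~0.41417, matching the "all" block's acc
```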
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Quake24__easyTermsSummerizer | 2023-08-27T12:38:15.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Quake24/easyTermsSummerizer
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Quake24/easyTermsSummerizer](https://huggingface.co/Quake24/easyTermsSummerizer)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Quake24__easyTermsSummerizer\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-09T13:57:43.173192](https://huggingface.co/datasets/open-llm-leaderboard/details_Quake24__easyTermsSummerizer/blob/main/results_2023-08-09T13%3A57%3A43.173192.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23106423923206632,\n\
\ \"acc_stderr\": 0.030702330041556567,\n \"acc_norm\": 0.23207464238328082,\n\
\ \"acc_norm_stderr\": 0.03072110922373694,\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.014843061507731603,\n \"mc2\": 0.4768728256786111,\n\
\ \"mc2_stderr\": 0.016533971116179963\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.19965870307167236,\n \"acc_stderr\": 0.011681625756888657,\n\
\ \"acc_norm\": 0.257679180887372,\n \"acc_norm_stderr\": 0.012780770562768409\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25652260505875324,\n\
\ \"acc_stderr\": 0.004358210689442268,\n \"acc_norm\": 0.2581159131647082,\n\
\ \"acc_norm_stderr\": 0.0043670376322045255\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.2350061199510404,\n \"mc1_stderr\": 0.014843061507731603,\n\
\ \"mc2\": 0.4768728256786111,\n \"mc2_stderr\": 0.016533971116179963\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Quake24/easyTermsSummerizer
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|arc:challenge|25_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hellaswag|10_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T13:57:43.173192.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T13:57:43.173192.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T13:57:43.173192.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T13:57:43.173192.parquet'
- config_name: results
data_files:
- split: 2023_08_09T13_57_43.173192
path:
- results_2023-08-09T13:57:43.173192.parquet
- split: latest
path:
- results_2023-08-09T13:57:43.173192.parquet
---
# Dataset Card for Evaluation run of Quake24/easyTermsSummerizer
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Quake24/easyTermsSummerizer
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Quake24/easyTermsSummerizer](https://huggingface.co/Quake24/easyTermsSummerizer) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
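As a sketch of the split-naming convention described above, a run timestamp maps to its split name by replacing the `-` and `:` separators with underscores (`timestamp_to_split` below is a hypothetical helper for illustration, not part of any library):

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to its split name, e.g.
    "2023-08-09T13:57:43.173192" -> "2023_08_09T13_57_43.173192".
    Hypothetical helper: split names replace "-" and ":" with "_"."""
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-08-09T13:57:43.173192"))
# -> 2023_08_09T13_57_43.173192
```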
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Quake24__easyTermsSummerizer",
"harness_truthfulqa_mc_0",
	split="latest")
```
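When several evaluation runs accumulate in one configuration, the zero-padded timestamped split names sort lexicographically in chronological order, so the most recent run can be selected with `max()`. A minimal sketch (the split names below are illustrative, not an exhaustive list for this dataset):

```python
# Illustrative timestamped split names; zero-padding makes
# lexicographic order match chronological order.
splits = ["2023_07_19T16_40_02.088273", "2023_08_09T13_57_43.173192"]
newest = max(splits)  # lexicographic max == most recent run
print(newest)
# -> 2023_08_09T13_57_43.173192
```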
## Latest results
These are the [latest results from run 2023-08-09T13:57:43.173192](https://huggingface.co/datasets/open-llm-leaderboard/details_Quake24__easyTermsSummerizer/blob/main/results_2023-08-09T13%3A57%3A43.173192.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each task's results in its timestamped splits, and the "latest" split of each configuration always points to the most recent eval):
```python
{
"all": {
"acc": 0.23106423923206632,
"acc_stderr": 0.030702330041556567,
"acc_norm": 0.23207464238328082,
"acc_norm_stderr": 0.03072110922373694,
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731603,
"mc2": 0.4768728256786111,
"mc2_stderr": 0.016533971116179963
},
"harness|arc:challenge|25": {
"acc": 0.19965870307167236,
"acc_stderr": 0.011681625756888657,
"acc_norm": 0.257679180887372,
"acc_norm_stderr": 0.012780770562768409
},
"harness|hellaswag|10": {
"acc": 0.25652260505875324,
"acc_stderr": 0.004358210689442268,
"acc_norm": 0.2581159131647082,
"acc_norm_stderr": 0.0043670376322045255
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731603,
"mc2": 0.4768728256786111,
"mc2_stderr": 0.016533971116179963
}
}
```
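For reference, the leaderboard's aggregate MMLU figure is an unweighted mean of the per-subtask accuracies. A minimal sketch using a handful of the `acc` values above (the plain-average rule is the leaderboard convention and is assumed here, not stated in this file):

```python
# A few per-subtask accuracies copied from the results block above.
subtask_acc = {
    "harness|hendrycksTest-machine_learning|5": 0.3125,
    "harness|hendrycksTest-management|5": 0.17475728155339806,
    "harness|hendrycksTest-marketing|5": 0.2905982905982906,
    "harness|hendrycksTest-medical_genetics|5": 0.3,
}

# Unweighted mean over the selected subtasks (assumed aggregation rule).
mean_acc = sum(subtask_acc.values()) / len(subtask_acc)
print(f"mean acc over {len(subtask_acc)} subtasks: {mean_acc:.4f}")
```

The full leaderboard average runs over all 57 MMLU subtasks; this snippet only illustrates the computation on a subset.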
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_PSanni__Deer-3b | 2023-09-16T20:50:57.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of PSanni/Deer-3b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PSanni/Deer-3b](https://huggingface.co/PSanni/Deer-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PSanni__Deer-3b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-16T20:50:46.284611](https://huggingface.co/datasets/open-llm-leaderboard/details_PSanni__Deer-3b/blob/main/results_2023-09-16T20-50-46.284611.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0003145973154362416,\n\
\ \"em_stderr\": 0.0001816137946883968,\n \"f1\": 0.04833053691275181,\n\
\ \"f1_stderr\": 0.0011657715269814616,\n \"acc\": 0.28880911790700303,\n\
\ \"acc_stderr\": 0.0077049156139354594\n },\n \"harness|drop|3\":\
\ {\n \"em\": 0.0003145973154362416,\n \"em_stderr\": 0.0001816137946883968,\n\
\ \"f1\": 0.04833053691275181,\n \"f1_stderr\": 0.0011657715269814616\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.003032600454890068,\n \
\ \"acc_stderr\": 0.0015145735612245434\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.574585635359116,\n \"acc_stderr\": 0.013895257666646375\n\
\ }\n}\n```"
repo_url: https://huggingface.co/PSanni/Deer-3b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|arc:challenge|25_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_16T20_50_46.284611
path:
- '**/details_harness|drop|3_2023-09-16T20-50-46.284611.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-16T20-50-46.284611.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_16T20_50_46.284611
path:
- '**/details_harness|gsm8k|5_2023-09-16T20-50-46.284611.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-16T20-50-46.284611.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hellaswag|10_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T14:13:49.318775.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T14:13:49.318775.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T14:13:49.318775.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_16T20_50_46.284611
path:
- '**/details_harness|winogrande|5_2023-09-16T20-50-46.284611.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-16T20-50-46.284611.parquet'
- config_name: results
data_files:
- split: 2023_08_09T14_13_49.318775
path:
- results_2023-08-09T14:13:49.318775.parquet
- split: 2023_09_16T20_50_46.284611
path:
- results_2023-09-16T20-50-46.284611.parquet
- split: latest
path:
- results_2023-09-16T20-50-46.284611.parquet
---
# Dataset Card for Evaluation run of PSanni/Deer-3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PSanni/Deer-3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PSanni/Deer-3b](https://huggingface.co/PSanni/Deer-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PSanni__Deer-3b",
"harness_winogrande_5",
split="train")
```
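The splits in each configuration are named after the run timestamp. As a rough sketch of the convention visible in this card's config listing (not an official API, just the pattern the filenames follow), the split name is the run's ISO timestamp with `-` and `:` replaced by `_`:

```python
# Hypothetical helper illustrating the split-naming convention seen in this
# card's config listing: the run's ISO timestamp with "-" and ":" mapped to "_".
def run_split_name(iso_timestamp: str) -> str:
    return iso_timestamp.replace("-", "_").replace(":", "_")

print(run_split_name("2023-09-16T20:50:46.284611"))  # 2023_09_16T20_50_46.284611
```

This mirrors how the run `2023-09-16T20:50:46.284611` appears as the split `2023_09_16T20_50_46.284611` in the `harness_winogrande_5` configuration above.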
## Latest results
These are the [latest results from run 2023-09-16T20:50:46.284611](https://huggingface.co/datasets/open-llm-leaderboard/details_PSanni__Deer-3b/blob/main/results_2023-09-16T20-50-46.284611.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0003145973154362416,
"em_stderr": 0.0001816137946883968,
"f1": 0.04833053691275181,
"f1_stderr": 0.0011657715269814616,
"acc": 0.28880911790700303,
"acc_stderr": 0.0077049156139354594
},
"harness|drop|3": {
"em": 0.0003145973154362416,
"em_stderr": 0.0001816137946883968,
"f1": 0.04833053691275181,
"f1_stderr": 0.0011657715269814616
},
"harness|gsm8k|5": {
"acc": 0.003032600454890068,
"acc_stderr": 0.0015145735612245434
},
"harness|winogrande|5": {
"acc": 0.574585635359116,
"acc_stderr": 0.013895257666646375
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_davzoku__cria-llama2-7b-v1.3 | 2023-08-27T12:38:18.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of davzoku/cria-llama2-7b-v1.3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [davzoku/cria-llama2-7b-v1.3](https://huggingface.co/davzoku/cria-llama2-7b-v1.3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_davzoku__cria-llama2-7b-v1.3\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-16T09:58:26.810126](https://huggingface.co/datasets/open-llm-leaderboard/details_davzoku__cria-llama2-7b-v1.3/blob/main/results_2023-08-16T09%3A58%3A26.810126.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4851084960551683,\n\
\ \"acc_stderr\": 0.03507077744313449,\n \"acc_norm\": 0.4888591199028739,\n\
\ \"acc_norm_stderr\": 0.03505688086251662,\n \"mc1\": 0.3011015911872705,\n\
\ \"mc1_stderr\": 0.016058999026100616,\n \"mc2\": 0.4558320321123706,\n\
\ \"mc2_stderr\": 0.015693738897689977\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49402730375426623,\n \"acc_stderr\": 0.014610348300255793,\n\
\ \"acc_norm\": 0.5273037542662116,\n \"acc_norm_stderr\": 0.014589589101985996\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5977892850029874,\n\
\ \"acc_stderr\": 0.004893418929918276,\n \"acc_norm\": 0.7857996415056762,\n\
\ \"acc_norm_stderr\": 0.004094279871733674\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4868421052631579,\n \"acc_stderr\": 0.04067533136309173,\n\
\ \"acc_norm\": 0.4868421052631579,\n \"acc_norm_stderr\": 0.04067533136309173\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.539622641509434,\n \"acc_stderr\": 0.030676096599389184,\n\
\ \"acc_norm\": 0.539622641509434,\n \"acc_norm_stderr\": 0.030676096599389184\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5208333333333334,\n\
\ \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 0.5208333333333334,\n\
\ \"acc_norm_stderr\": 0.041775789507399935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3988439306358382,\n\
\ \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.3988439306358382,\n\
\ \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4085106382978723,\n \"acc_stderr\": 0.03213418026701576,\n\
\ \"acc_norm\": 0.4085106382978723,\n \"acc_norm_stderr\": 0.03213418026701576\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\
\ \"acc_stderr\": 0.045595221419582166,\n \"acc_norm\": 0.37719298245614036,\n\
\ \"acc_norm_stderr\": 0.045595221419582166\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.04165774775728762,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.04165774775728762\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29894179894179895,\n \"acc_stderr\": 0.023577604791655805,\n \"\
acc_norm\": 0.29894179894179895,\n \"acc_norm_stderr\": 0.023577604791655805\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604675,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604675\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5225806451612903,\n \"acc_stderr\": 0.02841498501970786,\n \"\
acc_norm\": 0.5225806451612903,\n \"acc_norm_stderr\": 0.02841498501970786\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3694581280788177,\n \"acc_stderr\": 0.033959703819985726,\n \"\
acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.033959703819985726\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5878787878787879,\n \"acc_stderr\": 0.03843566993588717,\n\
\ \"acc_norm\": 0.5878787878787879,\n \"acc_norm_stderr\": 0.03843566993588717\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6060606060606061,\n \"acc_stderr\": 0.034812853382329624,\n \"\
acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.034812853382329624\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7150259067357513,\n \"acc_stderr\": 0.032577140777096614,\n\
\ \"acc_norm\": 0.7150259067357513,\n \"acc_norm_stderr\": 0.032577140777096614\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4230769230769231,\n \"acc_stderr\": 0.02504919787604234,\n \
\ \"acc_norm\": 0.4230769230769231,\n \"acc_norm_stderr\": 0.02504919787604234\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.02659393910184408,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.02659393910184408\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42436974789915966,\n \"acc_stderr\": 0.03210479051015776,\n\
\ \"acc_norm\": 0.42436974789915966,\n \"acc_norm_stderr\": 0.03210479051015776\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.036848815213890225,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.036848815213890225\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6770642201834862,\n \"acc_stderr\": 0.02004811592341532,\n \"\
acc_norm\": 0.6770642201834862,\n \"acc_norm_stderr\": 0.02004811592341532\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.0321495214780275,\n \"acc_norm\"\
: 0.3333333333333333,\n \"acc_norm_stderr\": 0.0321495214780275\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6617647058823529,\n\
\ \"acc_stderr\": 0.033205746129454324,\n \"acc_norm\": 0.6617647058823529,\n\
\ \"acc_norm_stderr\": 0.033205746129454324\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.6751054852320675,\n \"acc_stderr\": 0.03048603938910529,\n\
\ \"acc_norm\": 0.6751054852320675,\n \"acc_norm_stderr\": 0.03048603938910529\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5605381165919282,\n\
\ \"acc_stderr\": 0.03331092511038179,\n \"acc_norm\": 0.5605381165919282,\n\
\ \"acc_norm_stderr\": 0.03331092511038179\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.04338920305792401,\n\
\ \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.04338920305792401\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.628099173553719,\n \"acc_stderr\": 0.04412015806624504,\n \"acc_norm\"\
: 0.628099173553719,\n \"acc_norm_stderr\": 0.04412015806624504\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5521472392638037,\n \"acc_stderr\": 0.03906947479456606,\n\
\ \"acc_norm\": 0.5521472392638037,\n \"acc_norm_stderr\": 0.03906947479456606\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.04656147110012351,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.04656147110012351\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7094017094017094,\n\
\ \"acc_stderr\": 0.029745048572674074,\n \"acc_norm\": 0.7094017094017094,\n\
\ \"acc_norm_stderr\": 0.029745048572674074\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6756066411238825,\n\
\ \"acc_stderr\": 0.0167409290471627,\n \"acc_norm\": 0.6756066411238825,\n\
\ \"acc_norm_stderr\": 0.0167409290471627\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5144508670520231,\n \"acc_stderr\": 0.026907849856282542,\n\
\ \"acc_norm\": 0.5144508670520231,\n \"acc_norm_stderr\": 0.026907849856282542\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.21899441340782122,\n\
\ \"acc_stderr\": 0.013831676687303188,\n \"acc_norm\": 0.21899441340782122,\n\
\ \"acc_norm_stderr\": 0.013831676687303188\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.028607893699576066,\n\
\ \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.028607893699576066\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5659163987138264,\n\
\ \"acc_stderr\": 0.02815023224453559,\n \"acc_norm\": 0.5659163987138264,\n\
\ \"acc_norm_stderr\": 0.02815023224453559\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.027586006221607704,\n\
\ \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.027586006221607704\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3723404255319149,\n \"acc_stderr\": 0.028838921471251458,\n \
\ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.028838921471251458\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3513689700130378,\n\
\ \"acc_stderr\": 0.012192969457484023,\n \"acc_norm\": 0.3513689700130378,\n\
\ \"acc_norm_stderr\": 0.012192969457484023\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.45588235294117646,\n \"acc_stderr\": 0.030254372573976684,\n\
\ \"acc_norm\": 0.45588235294117646,\n \"acc_norm_stderr\": 0.030254372573976684\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4803921568627451,\n \"acc_stderr\": 0.020212274976302957,\n \
\ \"acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.020212274976302957\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5265306122448979,\n \"acc_stderr\": 0.03196412734523272,\n\
\ \"acc_norm\": 0.5265306122448979,\n \"acc_norm_stderr\": 0.03196412734523272\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6467661691542289,\n\
\ \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.6467661691542289,\n\
\ \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.03446296217088427,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.03446296217088427\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3011015911872705,\n\
\ \"mc1_stderr\": 0.016058999026100616,\n \"mc2\": 0.4558320321123706,\n\
\ \"mc2_stderr\": 0.015693738897689977\n }\n}\n```"
repo_url: https://huggingface.co/davzoku/cria-llama2-7b-v1.3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|arc:challenge|25_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hellaswag|10_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-16T09:58:26.810126.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-16T09:58:26.810126.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-16T09:58:26.810126.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-16T09:58:26.810126.parquet'
- config_name: results
data_files:
- split: 2023_08_16T09_58_26.810126
path:
- results_2023-08-16T09:58:26.810126.parquet
- split: latest
path:
- results_2023-08-16T09:58:26.810126.parquet
---
# Dataset Card for Evaluation run of davzoku/cria-llama2-7b-v1.3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/davzoku/cria-llama2-7b-v1.3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [davzoku/cria-llama2-7b-v1.3](https://huggingface.co/davzoku/cria-llama2-7b-v1.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_davzoku__cria-llama2-7b-v1.3",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-16T09:58:26.810126](https://huggingface.co/datasets/open-llm-leaderboard/details_davzoku__cria-llama2-7b-v1.3/blob/main/results_2023-08-16T09%3A58%3A26.810126.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.4851084960551683,
"acc_stderr": 0.03507077744313449,
"acc_norm": 0.4888591199028739,
"acc_norm_stderr": 0.03505688086251662,
"mc1": 0.3011015911872705,
"mc1_stderr": 0.016058999026100616,
"mc2": 0.4558320321123706,
"mc2_stderr": 0.015693738897689977
},
"harness|arc:challenge|25": {
"acc": 0.49402730375426623,
"acc_stderr": 0.014610348300255793,
"acc_norm": 0.5273037542662116,
"acc_norm_stderr": 0.014589589101985996
},
"harness|hellaswag|10": {
"acc": 0.5977892850029874,
"acc_stderr": 0.004893418929918276,
"acc_norm": 0.7857996415056762,
"acc_norm_stderr": 0.004094279871733674
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4868421052631579,
"acc_stderr": 0.04067533136309173,
"acc_norm": 0.4868421052631579,
"acc_norm_stderr": 0.04067533136309173
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.539622641509434,
"acc_stderr": 0.030676096599389184,
"acc_norm": 0.539622641509434,
"acc_norm_stderr": 0.030676096599389184
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5208333333333334,
"acc_stderr": 0.041775789507399935,
"acc_norm": 0.5208333333333334,
"acc_norm_stderr": 0.041775789507399935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.3988439306358382,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4085106382978723,
"acc_stderr": 0.03213418026701576,
"acc_norm": 0.4085106382978723,
"acc_norm_stderr": 0.03213418026701576
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.045595221419582166,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.045595221419582166
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.04165774775728762,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.04165774775728762
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29894179894179895,
"acc_stderr": 0.023577604791655805,
"acc_norm": 0.29894179894179895,
"acc_norm_stderr": 0.023577604791655805
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604675,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604675
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5225806451612903,
"acc_stderr": 0.02841498501970786,
"acc_norm": 0.5225806451612903,
"acc_norm_stderr": 0.02841498501970786
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3694581280788177,
"acc_stderr": 0.033959703819985726,
"acc_norm": 0.3694581280788177,
"acc_norm_stderr": 0.033959703819985726
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5878787878787879,
"acc_stderr": 0.03843566993588717,
"acc_norm": 0.5878787878787879,
"acc_norm_stderr": 0.03843566993588717
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.034812853382329624,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.034812853382329624
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7150259067357513,
"acc_stderr": 0.032577140777096614,
"acc_norm": 0.7150259067357513,
"acc_norm_stderr": 0.032577140777096614
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4230769230769231,
"acc_stderr": 0.02504919787604234,
"acc_norm": 0.4230769230769231,
"acc_norm_stderr": 0.02504919787604234
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.02659393910184408,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.02659393910184408
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42436974789915966,
"acc_stderr": 0.03210479051015776,
"acc_norm": 0.42436974789915966,
"acc_norm_stderr": 0.03210479051015776
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.036848815213890225,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.036848815213890225
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6770642201834862,
"acc_stderr": 0.02004811592341532,
"acc_norm": 0.6770642201834862,
"acc_norm_stderr": 0.02004811592341532
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.0321495214780275,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.0321495214780275
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.033205746129454324,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.033205746129454324
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6751054852320675,
"acc_stderr": 0.03048603938910529,
"acc_norm": 0.6751054852320675,
"acc_norm_stderr": 0.03048603938910529
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5605381165919282,
"acc_stderr": 0.03331092511038179,
"acc_norm": 0.5605381165919282,
"acc_norm_stderr": 0.03331092511038179
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5725190839694656,
"acc_stderr": 0.04338920305792401,
"acc_norm": 0.5725190839694656,
"acc_norm_stderr": 0.04338920305792401
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.04412015806624504,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.04412015806624504
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04766075165356461,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04766075165356461
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5521472392638037,
"acc_stderr": 0.03906947479456606,
"acc_norm": 0.5521472392638037,
"acc_norm_stderr": 0.03906947479456606
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.04656147110012351,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.04656147110012351
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7094017094017094,
"acc_stderr": 0.029745048572674074,
"acc_norm": 0.7094017094017094,
"acc_norm_stderr": 0.029745048572674074
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6756066411238825,
"acc_stderr": 0.0167409290471627,
"acc_norm": 0.6756066411238825,
"acc_norm_stderr": 0.0167409290471627
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.026907849856282542,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.026907849856282542
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.21899441340782122,
"acc_stderr": 0.013831676687303188,
"acc_norm": 0.21899441340782122,
"acc_norm_stderr": 0.013831676687303188
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.028607893699576066,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.028607893699576066
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5659163987138264,
"acc_stderr": 0.02815023224453559,
"acc_norm": 0.5659163987138264,
"acc_norm_stderr": 0.02815023224453559
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.027586006221607704,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.027586006221607704
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.028838921471251458,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.028838921471251458
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3513689700130378,
"acc_stderr": 0.012192969457484023,
"acc_norm": 0.3513689700130378,
"acc_norm_stderr": 0.012192969457484023
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.45588235294117646,
"acc_stderr": 0.030254372573976684,
"acc_norm": 0.45588235294117646,
"acc_norm_stderr": 0.030254372573976684
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4803921568627451,
"acc_stderr": 0.020212274976302957,
"acc_norm": 0.4803921568627451,
"acc_norm_stderr": 0.020212274976302957
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5265306122448979,
"acc_stderr": 0.03196412734523272,
"acc_norm": 0.5265306122448979,
"acc_norm_stderr": 0.03196412734523272
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6467661691542289,
"acc_stderr": 0.03379790611796777,
"acc_norm": 0.6467661691542289,
"acc_norm_stderr": 0.03379790611796777
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.03446296217088427,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.03446296217088427
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3011015911872705,
"mc1_stderr": 0.016058999026100616,
"mc2": 0.4558320321123706,
"mc2_stderr": 0.015693738897689977
}
}
```
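The leaderboard's "all" block (shown at the top of each results file) appears to be an unweighted mean over the per-task metrics. As a minimal sketch, the snippet below averages three of the MMLU sub-task accuracies listed above; the `scores` dict and the averaging itself are illustrative assumptions for this card, not code from the evaluation harness.

```python
# Hedged sketch: an unweighted mean over per-task accuracies, the way the
# leaderboard's "all" aggregate appears to be computed.
# The values below are copied verbatim from the results listed above.
scores = {
    "hendrycksTest-human_sexuality": 0.5725190839694656,
    "hendrycksTest-international_law": 0.628099173553719,
    "hendrycksTest-jurisprudence": 0.5833333333333334,
}

mean_acc = sum(scores.values()) / len(scores)
print(f"unweighted mean acc over {len(scores)} tasks: {mean_acc:.4f}")
```

The full aggregate in the card averages over all 61 task configurations in the same way (assuming an unweighted mean); this sketch only demonstrates the arithmetic on a small subset.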
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_hakurei__lotus-12B | 2023-08-27T12:38:20.000Z | ["region:us"] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of hakurei/lotus-12B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [hakurei/lotus-12B](https://huggingface.co/hakurei/lotus-12B) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_hakurei__lotus-12B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-18T13:41:37.836572](https://huggingface.co/datasets/open-llm-leaderboard/details_hakurei__lotus-12B/blob/main/results_2023-07-18T13%3A41%3A37.836572.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2485367177317206,\n\
\ \"acc_stderr\": 0.03124226591981919,\n \"acc_norm\": 0.25130617872419225,\n\
\ \"acc_norm_stderr\": 0.03125347081115218,\n \"mc1\": 0.22643818849449204,\n\
\ \"mc1_stderr\": 0.014651337324602574,\n \"mc2\": 0.40115476804436745,\n\
\ \"mc2_stderr\": 0.014756133562988513\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.26535836177474403,\n \"acc_stderr\": 0.012902554762313962,\n\
\ \"acc_norm\": 0.30716723549488056,\n \"acc_norm_stderr\": 0.013481034054980945\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4054969129655447,\n\
\ \"acc_stderr\": 0.0048998450871831105,\n \"acc_norm\": 0.5270862378012349,\n\
\ \"acc_norm_stderr\": 0.004982454383162063\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.28,\n\
\ \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.02737770662467071,\n\
\ \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.02737770662467071\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n\
\ \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\
\ \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n\
\ \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364396,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364396\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n\
\ \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3021276595744681,\n \"acc_stderr\": 0.03001755447188055,\n\
\ \"acc_norm\": 0.3021276595744681,\n \"acc_norm_stderr\": 0.03001755447188055\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24867724867724866,\n \"acc_stderr\": 0.02226181769240017,\n \"\
acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.02226181769240017\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.18253968253968253,\n\
\ \"acc_stderr\": 0.03455071019102148,\n \"acc_norm\": 0.18253968253968253,\n\
\ \"acc_norm_stderr\": 0.03455071019102148\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25806451612903225,\n\
\ \"acc_stderr\": 0.024892469172462833,\n \"acc_norm\": 0.25806451612903225,\n\
\ \"acc_norm_stderr\": 0.024892469172462833\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.29064039408866993,\n \"acc_stderr\": 0.0319474007226554,\n\
\ \"acc_norm\": 0.29064039408866993,\n \"acc_norm_stderr\": 0.0319474007226554\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\"\
: 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885416,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885416\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.24242424242424243,\n \"acc_stderr\": 0.030532892233932036,\n \"\
acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.030532892233932036\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.029778663037752954,\n\
\ \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.029778663037752954\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.22564102564102564,\n \"acc_stderr\": 0.021193632525148533,\n\
\ \"acc_norm\": 0.22564102564102564,\n \"acc_norm_stderr\": 0.021193632525148533\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23703703703703705,\n \"acc_stderr\": 0.02592887613276612,\n \
\ \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.02592887613276612\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.18907563025210083,\n \"acc_stderr\": 0.02543511943810535,\n\
\ \"acc_norm\": 0.18907563025210083,\n \"acc_norm_stderr\": 0.02543511943810535\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.19205298013245034,\n \"acc_stderr\": 0.032162984205936135,\n \"\
acc_norm\": 0.19205298013245034,\n \"acc_norm_stderr\": 0.032162984205936135\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24403669724770644,\n \"acc_stderr\": 0.018415286351416416,\n \"\
acc_norm\": 0.24403669724770644,\n \"acc_norm_stderr\": 0.018415286351416416\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.031141447823536023,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.031141447823536023\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2647058823529412,\n \"acc_stderr\": 0.030964517926923403,\n \"\
acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.030964517926923403\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3452914798206278,\n\
\ \"acc_stderr\": 0.031911001928357954,\n \"acc_norm\": 0.3452914798206278,\n\
\ \"acc_norm_stderr\": 0.031911001928357954\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.03727673575596918,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.03727673575596918\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2644628099173554,\n \"acc_stderr\": 0.04026187527591206,\n \"\
acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.04026187527591206\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26993865030674846,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.26993865030674846,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467764,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467764\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3300970873786408,\n \"acc_stderr\": 0.04656147110012352,\n\
\ \"acc_norm\": 0.3300970873786408,\n \"acc_norm_stderr\": 0.04656147110012352\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2264957264957265,\n\
\ \"acc_stderr\": 0.027421007295392916,\n \"acc_norm\": 0.2264957264957265,\n\
\ \"acc_norm_stderr\": 0.027421007295392916\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28735632183908044,\n\
\ \"acc_stderr\": 0.0161824107306827,\n \"acc_norm\": 0.28735632183908044,\n\
\ \"acc_norm_stderr\": 0.0161824107306827\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2254335260115607,\n \"acc_stderr\": 0.022497230190967547,\n\
\ \"acc_norm\": 0.2254335260115607,\n \"acc_norm_stderr\": 0.022497230190967547\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.02428861946604612,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.02428861946604612\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2540192926045016,\n\
\ \"acc_stderr\": 0.02472386150477169,\n \"acc_norm\": 0.2540192926045016,\n\
\ \"acc_norm_stderr\": 0.02472386150477169\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2808641975308642,\n \"acc_stderr\": 0.025006469755799204,\n\
\ \"acc_norm\": 0.2808641975308642,\n \"acc_norm_stderr\": 0.025006469755799204\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2765957446808511,\n \"acc_stderr\": 0.026684564340461004,\n \
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.026684564340461004\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23859191655801826,\n\
\ \"acc_stderr\": 0.010885929742002205,\n \"acc_norm\": 0.23859191655801826,\n\
\ \"acc_norm_stderr\": 0.010885929742002205\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.024562204314142317,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.024562204314142317\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26143790849673204,\n \"acc_stderr\": 0.017776947157528037,\n \
\ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.017776947157528037\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2545454545454545,\n\
\ \"acc_stderr\": 0.04172343038705383,\n \"acc_norm\": 0.2545454545454545,\n\
\ \"acc_norm_stderr\": 0.04172343038705383\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.17551020408163265,\n \"acc_stderr\": 0.024352800722970015,\n\
\ \"acc_norm\": 0.17551020408163265,\n \"acc_norm_stderr\": 0.024352800722970015\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n\
\ \"acc_stderr\": 0.029929415408348387,\n \"acc_norm\": 0.23383084577114427,\n\
\ \"acc_norm_stderr\": 0.029929415408348387\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.03460579907553026,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.03460579907553026\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03188578017686399,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03188578017686399\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22643818849449204,\n\
\ \"mc1_stderr\": 0.014651337324602574,\n \"mc2\": 0.40115476804436745,\n\
\ \"mc2_stderr\": 0.014756133562988513\n }\n}\n```"
repo_url: https://huggingface.co/hakurei/lotus-12B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|arc:challenge|25_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hellaswag|10_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T13:41:37.836572.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T13:41:37.836572.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T13:41:37.836572.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T13:41:37.836572.parquet'
- config_name: results
data_files:
- split: 2023_07_18T13_41_37.836572
path:
- results_2023-07-18T13:41:37.836572.parquet
- split: latest
path:
- results_2023-07-18T13:41:37.836572.parquet
---
# Dataset Card for Evaluation run of hakurei/lotus-12B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/hakurei/lotus-12B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [hakurei/lotus-12B](https://huggingface.co/hakurei/lotus-12B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_hakurei__lotus-12B",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-07-18T13:41:37.836572](https://huggingface.co/datasets/open-llm-leaderboard/details_hakurei__lotus-12B/blob/main/results_2023-07-18T13%3A41%3A37.836572.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2485367177317206,
"acc_stderr": 0.03124226591981919,
"acc_norm": 0.25130617872419225,
"acc_norm_stderr": 0.03125347081115218,
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602574,
"mc2": 0.40115476804436745,
"mc2_stderr": 0.014756133562988513
},
"harness|arc:challenge|25": {
"acc": 0.26535836177474403,
"acc_stderr": 0.012902554762313962,
"acc_norm": 0.30716723549488056,
"acc_norm_stderr": 0.013481034054980945
},
"harness|hellaswag|10": {
"acc": 0.4054969129655447,
"acc_stderr": 0.0048998450871831105,
"acc_norm": 0.5270862378012349,
"acc_norm_stderr": 0.004982454383162063
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.03785714465066653,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.03785714465066653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27169811320754716,
"acc_stderr": 0.02737770662467071,
"acc_norm": 0.27169811320754716,
"acc_norm_stderr": 0.02737770662467071
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364396,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364396
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3021276595744681,
"acc_stderr": 0.03001755447188055,
"acc_norm": 0.3021276595744681,
"acc_norm_stderr": 0.03001755447188055
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.02226181769240017,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.02226181769240017
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.18253968253968253,
"acc_stderr": 0.03455071019102148,
"acc_norm": 0.18253968253968253,
"acc_norm_stderr": 0.03455071019102148
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25806451612903225,
"acc_stderr": 0.024892469172462833,
"acc_norm": 0.25806451612903225,
"acc_norm_stderr": 0.024892469172462833
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.29064039408866993,
"acc_stderr": 0.0319474007226554,
"acc_norm": 0.29064039408866993,
"acc_norm_stderr": 0.0319474007226554
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885416,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885416
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.030532892233932036,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.030532892233932036
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.029778663037752954,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.029778663037752954
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.22564102564102564,
"acc_stderr": 0.021193632525148533,
"acc_norm": 0.22564102564102564,
"acc_norm_stderr": 0.021193632525148533
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.02592887613276612,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.02592887613276612
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.18907563025210083,
"acc_stderr": 0.02543511943810535,
"acc_norm": 0.18907563025210083,
"acc_norm_stderr": 0.02543511943810535
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.19205298013245034,
"acc_stderr": 0.032162984205936135,
"acc_norm": 0.19205298013245034,
"acc_norm_stderr": 0.032162984205936135
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24403669724770644,
"acc_stderr": 0.018415286351416416,
"acc_norm": 0.24403669724770644,
"acc_norm_stderr": 0.018415286351416416
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.031141447823536023,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.031141447823536023
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.030964517926923403,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.030964517926923403
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3452914798206278,
"acc_stderr": 0.031911001928357954,
"acc_norm": 0.3452914798206278,
"acc_norm_stderr": 0.031911001928357954
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.03727673575596918,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.03727673575596918
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2644628099173554,
"acc_stderr": 0.04026187527591206,
"acc_norm": 0.2644628099173554,
"acc_norm_stderr": 0.04026187527591206
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26993865030674846,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.26993865030674846,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467764,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467764
},
"harness|hendrycksTest-management|5": {
"acc": 0.3300970873786408,
"acc_stderr": 0.04656147110012352,
"acc_norm": 0.3300970873786408,
"acc_norm_stderr": 0.04656147110012352
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2264957264957265,
"acc_stderr": 0.027421007295392916,
"acc_norm": 0.2264957264957265,
"acc_norm_stderr": 0.027421007295392916
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28735632183908044,
"acc_stderr": 0.0161824107306827,
"acc_norm": 0.28735632183908044,
"acc_norm_stderr": 0.0161824107306827
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.022497230190967547,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.022497230190967547
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.02428861946604612,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.02428861946604612
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2540192926045016,
"acc_stderr": 0.02472386150477169,
"acc_norm": 0.2540192926045016,
"acc_norm_stderr": 0.02472386150477169
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2808641975308642,
"acc_stderr": 0.025006469755799204,
"acc_norm": 0.2808641975308642,
"acc_norm_stderr": 0.025006469755799204
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.026684564340461004,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.026684564340461004
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23859191655801826,
"acc_stderr": 0.010885929742002205,
"acc_norm": 0.23859191655801826,
"acc_norm_stderr": 0.010885929742002205
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.024562204314142317,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.024562204314142317
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.017776947157528037,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.017776947157528037
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.04172343038705383,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.04172343038705383
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17551020408163265,
"acc_stderr": 0.024352800722970015,
"acc_norm": 0.17551020408163265,
"acc_norm_stderr": 0.024352800722970015
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348387,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348387
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553026,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553026
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03188578017686399,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03188578017686399
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602574,
"mc2": 0.40115476804436745,
"mc2_stderr": 0.014756133562988513
}
}
```
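The results payload above is plain JSON, so the headline metrics can be pulled out with the standard library alone. A minimal sketch, with the aggregate "all" values copied verbatim from the results block above:

```python
import json

# Aggregate metrics copied verbatim from the "all" block of the results above.
payload = """
{
    "all": {
        "acc": 0.2485367177317206,
        "acc_stderr": 0.03124226591981919,
        "acc_norm": 0.25130617872419225,
        "acc_norm_stderr": 0.03125347081115218,
        "mc1": 0.22643818849449204,
        "mc1_stderr": 0.014651337324602574,
        "mc2": 0.40115476804436745,
        "mc2_stderr": 0.014756133562988513
    }
}
"""

results = json.loads(payload)
overall = results["all"]
print(f"acc={overall['acc']:.4f}  acc_norm={overall['acc_norm']:.4f}  mc2={overall['mc2']:.4f}")
```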
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_hakurei__instruct-12b | 2023-08-27T12:38:21.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of hakurei/instruct-12b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [hakurei/instruct-12b](https://huggingface.co/hakurei/instruct-12b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_hakurei__instruct-12b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-19T18:10:16.385807](https://huggingface.co/datasets/open-llm-leaderboard/details_hakurei__instruct-12b/blob/main/results_2023-07-19T18%3A10%3A16.385807.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.27403154507818933,\n\
\ \"acc_stderr\": 0.032179983157039,\n \"acc_norm\": 0.277374636621042,\n\
\ \"acc_norm_stderr\": 0.03217892628412224,\n \"mc1\": 0.21664626682986537,\n\
\ \"mc1_stderr\": 0.014421468452506983,\n \"mc2\": 0.3196486720150373,\n\
\ \"mc2_stderr\": 0.013605255058273893\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3856655290102389,\n \"acc_stderr\": 0.01422425097325717,\n\
\ \"acc_norm\": 0.4257679180887372,\n \"acc_norm_stderr\": 0.014449464278868802\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5104560844453296,\n\
\ \"acc_stderr\": 0.004988690229505662,\n \"acc_norm\": 0.6675960963951404,\n\
\ \"acc_norm_stderr\": 0.004701121421805423\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.35555555555555557,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.35555555555555557,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.26973684210526316,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.26973684210526316,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n\
\ \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.24,\n \
\ \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.24150943396226415,\n \"acc_stderr\": 0.026341480371118352,\n\
\ \"acc_norm\": 0.24150943396226415,\n \"acc_norm_stderr\": 0.026341480371118352\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23121387283236994,\n\
\ \"acc_stderr\": 0.03214737302029471,\n \"acc_norm\": 0.23121387283236994,\n\
\ \"acc_norm_stderr\": 0.03214737302029471\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.037082846624165444,\n\
\ \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.037082846624165444\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2425531914893617,\n \"acc_stderr\": 0.028020226271200217,\n\
\ \"acc_norm\": 0.2425531914893617,\n \"acc_norm_stderr\": 0.028020226271200217\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n\
\ \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24338624338624337,\n \"acc_stderr\": 0.02210112878741543,\n \"\
acc_norm\": 0.24338624338624337,\n \"acc_norm_stderr\": 0.02210112878741543\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1746031746031746,\n\
\ \"acc_stderr\": 0.03395490020856112,\n \"acc_norm\": 0.1746031746031746,\n\
\ \"acc_norm_stderr\": 0.03395490020856112\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2870967741935484,\n\
\ \"acc_stderr\": 0.025736542745594528,\n \"acc_norm\": 0.2870967741935484,\n\
\ \"acc_norm_stderr\": 0.025736542745594528\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3686868686868687,\n \"acc_stderr\": 0.034373055019806184,\n \"\
acc_norm\": 0.3686868686868687,\n \"acc_norm_stderr\": 0.034373055019806184\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.23316062176165803,\n \"acc_stderr\": 0.030516111371476008,\n\
\ \"acc_norm\": 0.23316062176165803,\n \"acc_norm_stderr\": 0.030516111371476008\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2205128205128205,\n \"acc_stderr\": 0.021020672680827912,\n\
\ \"acc_norm\": 0.2205128205128205,\n \"acc_norm_stderr\": 0.021020672680827912\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.02665353159671548,\n\
\ \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.02665353159671548\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24770642201834864,\n \"acc_stderr\": 0.018508143602547805,\n \"\
acc_norm\": 0.24770642201834864,\n \"acc_norm_stderr\": 0.018508143602547805\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.029531221160930918,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.029531221160930918\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n \
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\"\
: 0.29535864978902954,\n \"acc_stderr\": 0.02969633871342289,\n \"\
acc_norm\": 0.29535864978902954,\n \"acc_norm_stderr\": 0.02969633871342289\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.23766816143497757,\n\
\ \"acc_stderr\": 0.028568079464714277,\n \"acc_norm\": 0.23766816143497757,\n\
\ \"acc_norm_stderr\": 0.028568079464714277\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.45454545454545453,\n \"acc_stderr\": 0.045454545454545456,\n \"\
acc_norm\": 0.45454545454545453,\n \"acc_norm_stderr\": 0.045454545454545456\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04330043749650741,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04330043749650741\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.294478527607362,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.294478527607362,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952687,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952687\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24358974358974358,\n\
\ \"acc_stderr\": 0.028120966503914397,\n \"acc_norm\": 0.24358974358974358,\n\
\ \"acc_norm_stderr\": 0.028120966503914397\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2988505747126437,\n\
\ \"acc_stderr\": 0.016369256815093117,\n \"acc_norm\": 0.2988505747126437,\n\
\ \"acc_norm_stderr\": 0.016369256815093117\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.29190751445086704,\n \"acc_stderr\": 0.024476994076247337,\n\
\ \"acc_norm\": 0.29190751445086704,\n \"acc_norm_stderr\": 0.024476994076247337\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2875816993464052,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.2875816993464052,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3215434083601286,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.3215434083601286,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2716049382716049,\n \"acc_stderr\": 0.024748624490537375,\n\
\ \"acc_norm\": 0.2716049382716049,\n \"acc_norm_stderr\": 0.024748624490537375\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2816166883963494,\n\
\ \"acc_stderr\": 0.011487783272786696,\n \"acc_norm\": 0.2816166883963494,\n\
\ \"acc_norm_stderr\": 0.011487783272786696\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.02439819298665492,\n\
\ \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.02439819298665492\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2875816993464052,\n \"acc_stderr\": 0.018311653053648222,\n \
\ \"acc_norm\": 0.2875816993464052,\n \"acc_norm_stderr\": 0.018311653053648222\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2818181818181818,\n\
\ \"acc_stderr\": 0.0430911870994646,\n \"acc_norm\": 0.2818181818181818,\n\
\ \"acc_norm_stderr\": 0.0430911870994646\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.27755102040816326,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.27755102040816326,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.26865671641791045,\n\
\ \"acc_stderr\": 0.03134328358208955,\n \"acc_norm\": 0.26865671641791045,\n\
\ \"acc_norm_stderr\": 0.03134328358208955\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25903614457831325,\n\
\ \"acc_stderr\": 0.03410646614071855,\n \"acc_norm\": 0.25903614457831325,\n\
\ \"acc_norm_stderr\": 0.03410646614071855\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.30994152046783624,\n \"acc_stderr\": 0.03546976959393163,\n\
\ \"acc_norm\": 0.30994152046783624,\n \"acc_norm_stderr\": 0.03546976959393163\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21664626682986537,\n\
\ \"mc1_stderr\": 0.014421468452506983,\n \"mc2\": 0.3196486720150373,\n\
\ \"mc2_stderr\": 0.013605255058273893\n }\n}\n```"
repo_url: https://huggingface.co/hakurei/instruct-12b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:10:16.385807.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:10:16.385807.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:10:16.385807.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:10:16.385807.parquet'
- config_name: results
data_files:
- split: 2023_07_19T18_10_16.385807
path:
- results_2023-07-19T18:10:16.385807.parquet
- split: latest
path:
- results_2023-07-19T18:10:16.385807.parquet
---
# Dataset Card for Evaluation run of hakurei/instruct-12b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/hakurei/instruct-12b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [hakurei/instruct-12b](https://huggingface.co/hakurei/instruct-12b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_hakurei__instruct-12b",
"harness_truthfulqa_mc_0",
split="train")
```
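The split names used above follow a simple convention: each run's split is the run timestamp with `-` and `:` replaced by `_`. A minimal sketch of that mapping (a hypothetical helper, standard library only):

```python
def run_timestamp_to_split(ts: str) -> str:
    """Map a run timestamp, as used in result file names
    (e.g. '2023-07-19T18:10:16.385807'), to its split name
    (e.g. '2023_07_19T18_10_16.385807')."""
    # '-' and ':' are not allowed in split names, so they become '_';
    # the fractional seconds are kept as-is.
    return ts.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2023-07-19T18:10:16.385807"))
# -> 2023_07_19T18_10_16.385807
```

Passing such a timestamp-derived split name instead of `"train"` selects that specific run's results.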
## Latest results
These are the [latest results from run 2023-07-19T18:10:16.385807](https://huggingface.co/datasets/open-llm-leaderboard/details_hakurei__instruct-12b/blob/main/results_2023-07-19T18%3A10%3A16.385807.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.27403154507818933,
"acc_stderr": 0.032179983157039,
"acc_norm": 0.277374636621042,
"acc_norm_stderr": 0.03217892628412224,
"mc1": 0.21664626682986537,
"mc1_stderr": 0.014421468452506983,
"mc2": 0.3196486720150373,
"mc2_stderr": 0.013605255058273893
},
"harness|arc:challenge|25": {
"acc": 0.3856655290102389,
"acc_stderr": 0.01422425097325717,
"acc_norm": 0.4257679180887372,
"acc_norm_stderr": 0.014449464278868802
},
"harness|hellaswag|10": {
"acc": 0.5104560844453296,
"acc_stderr": 0.004988690229505662,
"acc_norm": 0.6675960963951404,
"acc_norm_stderr": 0.004701121421805423
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.26973684210526316,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.26973684210526316,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24150943396226415,
"acc_stderr": 0.026341480371118352,
"acc_norm": 0.24150943396226415,
"acc_norm_stderr": 0.026341480371118352
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.03214737302029471,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.03214737302029471
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.037082846624165444,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.037082846624165444
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2425531914893617,
"acc_stderr": 0.028020226271200217,
"acc_norm": 0.2425531914893617,
"acc_norm_stderr": 0.028020226271200217
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.0414243971948936,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.0414243971948936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24338624338624337,
"acc_stderr": 0.02210112878741543,
"acc_norm": 0.24338624338624337,
"acc_norm_stderr": 0.02210112878741543
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1746031746031746,
"acc_stderr": 0.03395490020856112,
"acc_norm": 0.1746031746031746,
"acc_norm_stderr": 0.03395490020856112
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2870967741935484,
"acc_stderr": 0.025736542745594528,
"acc_norm": 0.2870967741935484,
"acc_norm_stderr": 0.025736542745594528
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3686868686868687,
"acc_stderr": 0.034373055019806184,
"acc_norm": 0.3686868686868687,
"acc_norm_stderr": 0.034373055019806184
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23316062176165803,
"acc_stderr": 0.030516111371476008,
"acc_norm": 0.23316062176165803,
"acc_norm_stderr": 0.030516111371476008
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2205128205128205,
"acc_stderr": 0.021020672680827912,
"acc_norm": 0.2205128205128205,
"acc_norm_stderr": 0.021020672680827912
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.02665353159671548,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.02665353159671548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24770642201834864,
"acc_stderr": 0.018508143602547805,
"acc_norm": 0.24770642201834864,
"acc_norm_stderr": 0.018508143602547805
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25,
"acc_stderr": 0.029531221160930918,
"acc_norm": 0.25,
"acc_norm_stderr": 0.029531221160930918
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.29535864978902954,
"acc_stderr": 0.02969633871342289,
"acc_norm": 0.29535864978902954,
"acc_norm_stderr": 0.02969633871342289
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.23766816143497757,
"acc_stderr": 0.028568079464714277,
"acc_norm": 0.23766816143497757,
"acc_norm_stderr": 0.028568079464714277
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.45454545454545453,
"acc_stderr": 0.045454545454545456,
"acc_norm": 0.45454545454545453,
"acc_norm_stderr": 0.045454545454545456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650741,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650741
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.294478527607362,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.294478527607362,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952687,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952687
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.028120966503914397,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.028120966503914397
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2988505747126437,
"acc_stderr": 0.016369256815093117,
"acc_norm": 0.2988505747126437,
"acc_norm_stderr": 0.016369256815093117
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.29190751445086704,
"acc_stderr": 0.024476994076247337,
"acc_norm": 0.29190751445086704,
"acc_norm_stderr": 0.024476994076247337
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2875816993464052,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.2875816993464052,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3215434083601286,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.3215434083601286,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2716049382716049,
"acc_stderr": 0.024748624490537375,
"acc_norm": 0.2716049382716049,
"acc_norm_stderr": 0.024748624490537375
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930902,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930902
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2816166883963494,
"acc_stderr": 0.011487783272786696,
"acc_norm": 0.2816166883963494,
"acc_norm_stderr": 0.011487783272786696
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.02439819298665492,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.02439819298665492
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2875816993464052,
"acc_stderr": 0.018311653053648222,
"acc_norm": 0.2875816993464052,
"acc_norm_stderr": 0.018311653053648222
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2818181818181818,
"acc_stderr": 0.0430911870994646,
"acc_norm": 0.2818181818181818,
"acc_norm_stderr": 0.0430911870994646
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.27755102040816326,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.27755102040816326,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.26865671641791045,
"acc_stderr": 0.03134328358208955,
"acc_norm": 0.26865671641791045,
"acc_norm_stderr": 0.03134328358208955
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25903614457831325,
"acc_stderr": 0.03410646614071855,
"acc_norm": 0.25903614457831325,
"acc_norm_stderr": 0.03410646614071855
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30994152046783624,
"acc_stderr": 0.03546976959393163,
"acc_norm": 0.30994152046783624,
"acc_norm_stderr": 0.03546976959393163
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21664626682986537,
"mc1_stderr": 0.014421468452506983,
"mc2": 0.3196486720150373,
"mc2_stderr": 0.013605255058273893
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_llama-anon__instruct-13b | 2023-09-17T02:24:19.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of llama-anon/instruct-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [llama-anon/instruct-13b](https://huggingface.co/llama-anon/instruct-13b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_llama-anon__instruct-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T02:24:06.962063](https://huggingface.co/datasets/open-llm-leaderboard/details_llama-anon__instruct-13b/blob/main/results_2023-09-17T02-24-06.962063.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.31438758389261745,\n\
\ \"em_stderr\": 0.004754574768123327,\n \"f1\": 0.3769809144295322,\n\
\ \"f1_stderr\": 0.004680725874888402,\n \"acc\": 0.37917019961428294,\n\
\ \"acc_stderr\": 0.00825067276736675\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.31438758389261745,\n \"em_stderr\": 0.004754574768123327,\n\
\ \"f1\": 0.3769809144295322,\n \"f1_stderr\": 0.004680725874888402\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.022744503411675512,\n \
\ \"acc_stderr\": 0.004106620637749704\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7355958958168903,\n \"acc_stderr\": 0.012394724896983799\n\
\ }\n}\n```"
repo_url: https://huggingface.co/llama-anon/instruct-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T02_24_06.962063
path:
- '**/details_harness|drop|3_2023-09-17T02-24-06.962063.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T02-24-06.962063.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T02_24_06.962063
path:
- '**/details_harness|gsm8k|5_2023-09-17T02-24-06.962063.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T02-24-06.962063.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T02_24_06.962063
path:
- '**/details_harness|winogrande|5_2023-09-17T02-24-06.962063.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T02-24-06.962063.parquet'
- config_name: results
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- results_2023-07-19T18:48:36.816075.parquet
- split: 2023_09_17T02_24_06.962063
path:
- results_2023-09-17T02-24-06.962063.parquet
- split: latest
path:
- results_2023-09-17T02-24-06.962063.parquet
---
# Dataset Card for Evaluation run of llama-anon/instruct-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/llama-anon/instruct-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [llama-anon/instruct-13b](https://huggingface.co/llama-anon/instruct-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_llama-anon__instruct-13b",
"harness_winogrande_5",
split="train")
```
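The per-run split names that appear throughout the configs above are simply the run timestamps with hyphens and colons replaced by underscores. A small helper (hypothetical, not part of the `datasets` API) makes that mapping explicit:

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp (as used in the result filenames) to its split name.

    Hyphens and colons become underscores; the 'T' separator and the
    fractional seconds are kept as-is.
    """
    return ts.replace("-", "_").replace(":", "_")

# The run shown in "Latest results" below:
print(timestamp_to_split("2023-09-17T02:24:06.962063"))
# → 2023_09_17T02_24_06.962063
```

Passing the resulting string as `split=` selects that specific run, while `split="latest"` always resolves to the most recent one.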
## Latest results
These are the [latest results from run 2023-09-17T02:24:06.962063](https://huggingface.co/datasets/open-llm-leaderboard/details_llama-anon__instruct-13b/blob/main/results_2023-09-17T02-24-06.962063.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.31438758389261745,
"em_stderr": 0.004754574768123327,
"f1": 0.3769809144295322,
"f1_stderr": 0.004680725874888402,
"acc": 0.37917019961428294,
"acc_stderr": 0.00825067276736675
},
"harness|drop|3": {
"em": 0.31438758389261745,
"em_stderr": 0.004754574768123327,
"f1": 0.3769809144295322,
"f1_stderr": 0.004680725874888402
},
"harness|gsm8k|5": {
"acc": 0.022744503411675512,
"acc_stderr": 0.004106620637749704
},
"harness|winogrande|5": {
"acc": 0.7355958958168903,
"acc_stderr": 0.012394724896983799
}
}
```
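The aggregated scores above are keyed as `harness|<task>|<n_shot>`, with an `"all"` entry holding the overall metrics. Once the results file is loaded (e.g. via `json.load`), individual metrics can be pulled out directly; a minimal sketch over the per-task entries shown above:

```python
# Per-task results dict, abridged from the JSON shown above.
results = {
    "harness|drop|3": {"em": 0.31438758389261745, "f1": 0.3769809144295322},
    "harness|gsm8k|5": {"acc": 0.022744503411675512},
    "harness|winogrande|5": {"acc": 0.7355958958168903},
}

# Split each key into harness prefix, task name, and few-shot count.
for key, metrics in results.items():
    _, task, n_shot = key.split("|")
    print(f"{task} ({n_shot}-shot): {metrics}")

# Single metric lookup, e.g. the winogrande accuracy:
acc = results["harness|winogrande|5"]["acc"]
```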
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_danielhanchen__open_llama_3b_600bt_preview | 2023-09-22T13:47:46.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of danielhanchen/open_llama_3b_600bt_preview
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [danielhanchen/open_llama_3b_600bt_preview](https://huggingface.co/danielhanchen/open_llama_3b_600bt_preview)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_danielhanchen__open_llama_3b_600bt_preview\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T13:47:34.979572](https://huggingface.co/datasets/open-llm-leaderboard/details_danielhanchen__open_llama_3b_600bt_preview/blob/main/results_2023-09-22T13-47-34.979572.json)\
\ (note that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each of them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0009437919463087249,\n\
\ \"em_stderr\": 0.0003144653119413175,\n \"f1\": 0.04996329697986588,\n\
\ \"f1_stderr\": 0.0012567293128089149,\n \"acc\": 0.32150142444857593,\n\
\ \"acc_stderr\": 0.007826931083969837\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0009437919463087249,\n \"em_stderr\": 0.0003144653119413175,\n\
\ \"f1\": 0.04996329697986588,\n \"f1_stderr\": 0.0012567293128089149\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006065200909780136,\n \
\ \"acc_stderr\": 0.002138670301460455\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6369376479873717,\n \"acc_stderr\": 0.01351519186647922\n\
\ }\n}\n```"
repo_url: https://huggingface.co/danielhanchen/open_llama_3b_600bt_preview
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|arc:challenge|25_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T13_47_34.979572
path:
- '**/details_harness|drop|3_2023-09-22T13-47-34.979572.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T13-47-34.979572.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T13_47_34.979572
path:
- '**/details_harness|gsm8k|5_2023-09-22T13-47-34.979572.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T13-47-34.979572.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hellaswag|10_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:00:20.394414.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T15:00:20.394414.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T15:00:20.394414.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T13_47_34.979572
path:
- '**/details_harness|winogrande|5_2023-09-22T13-47-34.979572.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T13-47-34.979572.parquet'
- config_name: results
data_files:
- split: 2023_07_19T15_00_20.394414
path:
- results_2023-07-19T15:00:20.394414.parquet
- split: 2023_09_22T13_47_34.979572
path:
- results_2023-09-22T13-47-34.979572.parquet
- split: latest
path:
- results_2023-09-22T13-47-34.979572.parquet
---
# Dataset Card for Evaluation run of danielhanchen/open_llama_3b_600bt_preview
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/danielhanchen/open_llama_3b_600bt_preview
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [danielhanchen/open_llama_3b_600bt_preview](https://huggingface.co/danielhanchen/open_llama_3b_600bt_preview) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_danielhanchen__open_llama_3b_600bt_preview",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-22T13:47:34.979572](https://huggingface.co/datasets/open-llm-leaderboard/details_danielhanchen__open_llama_3b_600bt_preview/blob/main/results_2023-09-22T13-47-34.979572.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0009437919463087249,
"em_stderr": 0.0003144653119413175,
"f1": 0.04996329697986588,
"f1_stderr": 0.0012567293128089149,
"acc": 0.32150142444857593,
"acc_stderr": 0.007826931083969837
},
"harness|drop|3": {
"em": 0.0009437919463087249,
"em_stderr": 0.0003144653119413175,
"f1": 0.04996329697986588,
"f1_stderr": 0.0012567293128089149
},
"harness|gsm8k|5": {
"acc": 0.006065200909780136,
"acc_stderr": 0.002138670301460455
},
"harness|winogrande|5": {
"acc": 0.6369376479873717,
"acc_stderr": 0.01351519186647922
}
}
```
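As a quick illustration (a minimal sketch, not part of the generated card), the nested metrics dict shown above can be flattened to pull out each task's accuracy; the dict below copies the values from the run shown, with keys taken from the harness task names:

```python
# Minimal sketch: extract per-task accuracy from a results dict shaped like
# the JSON above (values copied from the run shown above).
results = {
    "harness|gsm8k|5": {"acc": 0.006065200909780136, "acc_stderr": 0.002138670301460455},
    "harness|winogrande|5": {"acc": 0.6369376479873717, "acc_stderr": 0.01351519186647922},
}

# Keep only the entries that report an accuracy metric.
accs = {task: vals["acc"] for task, vals in results.items() if "acc" in vals}
for task, acc in sorted(accs.items()):
    print(f"{task}: acc = {acc:.4f}")
```

The same pattern works for the `em` and `f1` fields of the `harness|drop|3` entry.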
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_elinas__chronos-13b-v2 | 2023-09-23T04:36:49.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of elinas/chronos-13b-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [elinas/chronos-13b-v2](https://huggingface.co/elinas/chronos-13b-v2) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_elinas__chronos-13b-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T04:36:37.987786](https://huggingface.co/datasets/open-llm-leaderboard/details_elinas__chronos-13b-v2/blob/main/results_2023-09-23T04-36-37.987786.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.004928691275167785,\n\
\ \"em_stderr\": 0.0007171872517059757,\n \"f1\": 0.06743498322147651,\n\
\ \"f1_stderr\": 0.0015375167645534094,\n \"acc\": 0.4317781582158161,\n\
\ \"acc_stderr\": 0.01043976411288187\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.004928691275167785,\n \"em_stderr\": 0.0007171872517059757,\n\
\ \"f1\": 0.06743498322147651,\n \"f1_stderr\": 0.0015375167645534094\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11296436694465505,\n \
\ \"acc_stderr\": 0.00871933902883305\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7505919494869772,\n \"acc_stderr\": 0.012160189196930689\n\
\ }\n}\n```"
repo_url: https://huggingface.co/elinas/chronos-13b-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|arc:challenge|25_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T04_36_37.987786
path:
- '**/details_harness|drop|3_2023-09-23T04-36-37.987786.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T04-36-37.987786.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T04_36_37.987786
path:
- '**/details_harness|gsm8k|5_2023-09-23T04-36-37.987786.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T04-36-37.987786.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hellaswag|10_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T10:27:13.530868.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T10:27:13.530868.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T10:27:13.530868.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T04_36_37.987786
path:
- '**/details_harness|winogrande|5_2023-09-23T04-36-37.987786.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T04-36-37.987786.parquet'
- config_name: results
data_files:
- split: 2023_08_09T10_27_13.530868
path:
- results_2023-08-09T10:27:13.530868.parquet
- split: 2023_09_23T04_36_37.987786
path:
- results_2023-09-23T04-36-37.987786.parquet
- split: latest
path:
- results_2023-09-23T04-36-37.987786.parquet
---
# Dataset Card for Evaluation run of elinas/chronos-13b-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/elinas/chronos-13b-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [elinas/chronos-13b-v2](https://huggingface.co/elinas/chronos-13b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_elinas__chronos-13b-v2",
"harness_winogrande_5",
split="train")
```
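Since each run is stored under a split named after its timestamp (plus the convenience alias "latest"), the most recent run can also be recovered directly from the split names. A minimal sketch, using the split names that appear in this card's configs; it relies on the zero-padded timestamp format sorting lexicographically in chronological order:

```python
# Split names as they appear in this card's configs: one per
# timestamped run, plus the convenience alias "latest".
splits = [
    "2023_08_09T10_27_13.530868",
    "2023_09_23T04_36_37.987786",
    "latest",
]

# Zero-padded timestamps sort lexicographically in chronological
# order, so max() over the timestamped names picks the newest run.
timestamped = [s for s in splits if s != "latest"]
newest = max(timestamped)
print(newest)  # 2023_09_23T04_36_37.987786
```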
## Latest results
These are the [latest results from run 2023-09-23T04:36:37.987786](https://huggingface.co/datasets/open-llm-leaderboard/details_elinas__chronos-13b-v2/blob/main/results_2023-09-23T04-36-37.987786.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.004928691275167785,
"em_stderr": 0.0007171872517059757,
"f1": 0.06743498322147651,
"f1_stderr": 0.0015375167645534094,
"acc": 0.4317781582158161,
"acc_stderr": 0.01043976411288187
},
"harness|drop|3": {
"em": 0.004928691275167785,
"em_stderr": 0.0007171872517059757,
"f1": 0.06743498322147651,
"f1_stderr": 0.0015375167645534094
},
"harness|gsm8k|5": {
"acc": 0.11296436694465505,
"acc_stderr": 0.00871933902883305
},
"harness|winogrande|5": {
"acc": 0.7505919494869772,
"acc_stderr": 0.012160189196930689
}
}
```
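As a quick sanity check on the figures above, the top-level "all" accuracy appears to be the plain unweighted mean of the per-task accuracies (here gsm8k and winogrande, the two tasks reporting `acc`); this is an observation about these numbers, not a documented guarantee:

```python
# Per-task accuracies copied from the results above.
task_acc = {
    "harness|gsm8k|5": 0.11296436694465505,
    "harness|winogrande|5": 0.7505919494869772,
}

# The reported "all" acc matches the unweighted mean of the
# per-task values to within floating-point rounding.
mean_acc = sum(task_acc.values()) / len(task_acc)
assert abs(mean_acc - 0.4317781582158161) < 1e-12
```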
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_elinas__chronos-33b | 2023-08-27T12:38:27.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of elinas/chronos-33b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [elinas/chronos-33b](https://huggingface.co/elinas/chronos-33b) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_elinas__chronos-33b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-20T14:01:11.905488](https://huggingface.co/datasets/open-llm-leaderboard/details_elinas__chronos-33b/blob/main/results_2023-07-20T14%3A01%3A11.905488.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5604776654669718,\n\
\ \"acc_stderr\": 0.03437324033891388,\n \"acc_norm\": 0.5644298693575048,\n\
\ \"acc_norm_stderr\": 0.03435109916242029,\n \"mc1\": 0.31456548347613217,\n\
\ \"mc1_stderr\": 0.016255241993179185,\n \"mc2\": 0.46666811510165307,\n\
\ \"mc2_stderr\": 0.014629625171707208\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5895904436860068,\n \"acc_stderr\": 0.014374922192642667,\n\
\ \"acc_norm\": 0.6220136518771331,\n \"acc_norm_stderr\": 0.014169664520303098\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6340370444134634,\n\
\ \"acc_stderr\": 0.004807146925162056,\n \"acc_norm\": 0.8347938657637921,\n\
\ \"acc_norm_stderr\": 0.0037060751843802867\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.03999309712777475,\n\
\ \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.03999309712777475\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.03019761160019795,\n\
\ \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.03019761160019795\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.41,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4851063829787234,\n \"acc_stderr\": 0.032671518489247764,\n\
\ \"acc_norm\": 0.4851063829787234,\n \"acc_norm_stderr\": 0.032671518489247764\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3253968253968254,\n \"acc_stderr\": 0.024130158299762623,\n \"\
acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.024130158299762623\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574925,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574925\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6419354838709678,\n\
\ \"acc_stderr\": 0.027273890594300642,\n \"acc_norm\": 0.6419354838709678,\n\
\ \"acc_norm_stderr\": 0.027273890594300642\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3891625615763547,\n \"acc_stderr\": 0.03430462416103872,\n\
\ \"acc_norm\": 0.3891625615763547,\n \"acc_norm_stderr\": 0.03430462416103872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145634,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145634\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6262626262626263,\n \"acc_stderr\": 0.03446897738659333,\n \"\
acc_norm\": 0.6262626262626263,\n \"acc_norm_stderr\": 0.03446897738659333\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5128205128205128,\n \"acc_stderr\": 0.025342671293807257,\n\
\ \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.025342671293807257\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5084033613445378,\n \"acc_stderr\": 0.03247390276569669,\n \
\ \"acc_norm\": 0.5084033613445378,\n \"acc_norm_stderr\": 0.03247390276569669\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7467889908256881,\n \"acc_stderr\": 0.01864407304137504,\n \"\
acc_norm\": 0.7467889908256881,\n \"acc_norm_stderr\": 0.01864407304137504\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7254901960784313,\n \"acc_stderr\": 0.031321798030832904,\n \"\
acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.031321798030832904\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.02765215314415926,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.02765215314415926\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n\
\ \"acc_stderr\": 0.026453508054040332,\n \"acc_norm\": 0.7948717948717948,\n\
\ \"acc_norm_stderr\": 0.026453508054040332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7496807151979565,\n\
\ \"acc_stderr\": 0.015491088951494581,\n \"acc_norm\": 0.7496807151979565,\n\
\ \"acc_norm_stderr\": 0.015491088951494581\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.026152198619726792,\n\
\ \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.026152198619726792\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2916201117318436,\n\
\ \"acc_stderr\": 0.015201032512520439,\n \"acc_norm\": 0.2916201117318436,\n\
\ \"acc_norm_stderr\": 0.015201032512520439\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6013071895424836,\n \"acc_stderr\": 0.02803609227389177,\n\
\ \"acc_norm\": 0.6013071895424836,\n \"acc_norm_stderr\": 0.02803609227389177\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n\
\ \"acc_stderr\": 0.027368078243971628,\n \"acc_norm\": 0.6334405144694534,\n\
\ \"acc_norm_stderr\": 0.027368078243971628\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.026822801759507894,\n\
\ \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.026822801759507894\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.43617021276595747,\n \"acc_stderr\": 0.02958345203628407,\n \
\ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.02958345203628407\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.438722294654498,\n\
\ \"acc_stderr\": 0.012673969883493272,\n \"acc_norm\": 0.438722294654498,\n\
\ \"acc_norm_stderr\": 0.012673969883493272\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.49264705882352944,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.49264705882352944,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5718954248366013,\n \"acc_stderr\": 0.020017629214213094,\n \
\ \"acc_norm\": 0.5718954248366013,\n \"acc_norm_stderr\": 0.020017629214213094\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6204081632653061,\n \"acc_stderr\": 0.031067211262872478,\n\
\ \"acc_norm\": 0.6204081632653061,\n \"acc_norm_stderr\": 0.031067211262872478\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.031157150869355575,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.031157150869355575\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.03889951252827217,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.03889951252827217\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31456548347613217,\n\
\ \"mc1_stderr\": 0.016255241993179185,\n \"mc2\": 0.46666811510165307,\n\
\ \"mc2_stderr\": 0.014629625171707208\n }\n}\n```"
repo_url: https://huggingface.co/elinas/chronos-33b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|arc:challenge|25_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hellaswag|10_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-20T14:01:11.905488.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-20T14:01:11.905488.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-20T14:01:11.905488.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-20T14:01:11.905488.parquet'
- config_name: results
data_files:
- split: 2023_07_20T14_01_11.905488
path:
- results_2023-07-20T14:01:11.905488.parquet
- split: latest
path:
- results_2023-07-20T14:01:11.905488.parquet
---
# Dataset Card for Evaluation run of elinas/chronos-33b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/elinas/chronos-33b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [elinas/chronos-33b](https://huggingface.co/elinas/chronos-33b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_elinas__chronos-33b",
"harness_truthfulqa_mc_0",
split="train")
```
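Each per-task detail lives in its own configuration. Judging from the `config_name` entries listed above, configuration names appear to be derived from the harness task identifier by replacing the `|`, `:`, and `-` separators with underscores. A minimal sketch of that mapping (an observation of the naming pattern in this card, not an official leaderboard API):

```python
# Sketch: derive a dataset config name from a harness task identifier.
# The mapping is inferred from the config_name entries in this card,
# not taken from official leaderboard code.
def task_to_config_name(task: str) -> str:
    return task.replace("|", "_").replace(":", "_").replace("-", "_")

print(task_to_config_name("harness|hendrycksTest-abstract_algebra|5"))
# harness_hendrycksTest_abstract_algebra_5
print(task_to_config_name("harness|truthfulqa:mc|0"))
# harness_truthfulqa_mc_0
```

The resulting string can then be passed as the second argument to `load_dataset`, as in the example above.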
## Latest results
These are the [latest results from run 2023-07-20T14:01:11.905488](https://huggingface.co/datasets/open-llm-leaderboard/details_elinas__chronos-33b/blob/main/results_2023-07-20T14%3A01%3A11.905488.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5604776654669718,
"acc_stderr": 0.03437324033891388,
"acc_norm": 0.5644298693575048,
"acc_norm_stderr": 0.03435109916242029,
"mc1": 0.31456548347613217,
"mc1_stderr": 0.016255241993179185,
"mc2": 0.46666811510165307,
"mc2_stderr": 0.014629625171707208
},
"harness|arc:challenge|25": {
"acc": 0.5895904436860068,
"acc_stderr": 0.014374922192642667,
"acc_norm": 0.6220136518771331,
"acc_norm_stderr": 0.014169664520303098
},
"harness|hellaswag|10": {
"acc": 0.6340370444134634,
"acc_stderr": 0.004807146925162056,
"acc_norm": 0.8347938657637921,
"acc_norm_stderr": 0.0037060751843802867
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5921052631578947,
"acc_stderr": 0.03999309712777475,
"acc_norm": 0.5921052631578947,
"acc_norm_stderr": 0.03999309712777475
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5962264150943396,
"acc_stderr": 0.03019761160019795,
"acc_norm": 0.5962264150943396,
"acc_norm_stderr": 0.03019761160019795
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077636,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4851063829787234,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.4851063829787234,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.045144961328736334,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.045144961328736334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.04164188720169377,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.04164188720169377
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.024130158299762623,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.024130158299762623
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574925,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574925
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6419354838709678,
"acc_stderr": 0.027273890594300642,
"acc_norm": 0.6419354838709678,
"acc_norm_stderr": 0.027273890594300642
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3891625615763547,
"acc_stderr": 0.03430462416103872,
"acc_norm": 0.3891625615763547,
"acc_norm_stderr": 0.03430462416103872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145634,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145634
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6262626262626263,
"acc_stderr": 0.03446897738659333,
"acc_norm": 0.6262626262626263,
"acc_norm_stderr": 0.03446897738659333
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5128205128205128,
"acc_stderr": 0.025342671293807257,
"acc_norm": 0.5128205128205128,
"acc_norm_stderr": 0.025342671293807257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5084033613445378,
"acc_stderr": 0.03247390276569669,
"acc_norm": 0.5084033613445378,
"acc_norm_stderr": 0.03247390276569669
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7467889908256881,
"acc_stderr": 0.01864407304137504,
"acc_norm": 0.7467889908256881,
"acc_norm_stderr": 0.01864407304137504
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.031321798030832904,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.031321798030832904
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.02765215314415926,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.02765215314415926
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.026453508054040332,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.026453508054040332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7496807151979565,
"acc_stderr": 0.015491088951494581,
"acc_norm": 0.7496807151979565,
"acc_norm_stderr": 0.015491088951494581
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.026152198619726792,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.026152198619726792
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2916201117318436,
"acc_stderr": 0.015201032512520439,
"acc_norm": 0.2916201117318436,
"acc_norm_stderr": 0.015201032512520439
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6013071895424836,
"acc_stderr": 0.02803609227389177,
"acc_norm": 0.6013071895424836,
"acc_norm_stderr": 0.02803609227389177
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6334405144694534,
"acc_stderr": 0.027368078243971628,
"acc_norm": 0.6334405144694534,
"acc_norm_stderr": 0.027368078243971628
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6327160493827161,
"acc_stderr": 0.026822801759507894,
"acc_norm": 0.6327160493827161,
"acc_norm_stderr": 0.026822801759507894
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.02958345203628407,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.02958345203628407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.438722294654498,
"acc_stderr": 0.012673969883493272,
"acc_norm": 0.438722294654498,
"acc_norm_stderr": 0.012673969883493272
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.49264705882352944,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.49264705882352944,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5718954248366013,
"acc_stderr": 0.020017629214213094,
"acc_norm": 0.5718954248366013,
"acc_norm_stderr": 0.020017629214213094
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6204081632653061,
"acc_stderr": 0.031067211262872478,
"acc_norm": 0.6204081632653061,
"acc_norm_stderr": 0.031067211262872478
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.031157150869355575,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.031157150869355575
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.03889951252827217,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.03889951252827217
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31456548347613217,
"mc1_stderr": 0.016255241993179185,
"mc2": 0.46666811510165307,
"mc2_stderr": 0.014629625171707208
}
}
```
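The "all" block at the top of the results is an aggregate over the individual tasks. As a rough illustration (not the official leaderboard aggregation code), an MMLU-style average can be recomputed from the per-task `acc_norm` values like so, using a small excerpt of the results above:

```python
# Rough sketch: average acc_norm over the hendrycksTest (MMLU) tasks.
# Uses a three-task excerpt of the results above; not official leaderboard code.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.5111111111111111},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.5921052631578947},
    "harness|truthfulqa:mc|0": {"mc2": 0.46666811510165307},  # excluded below
}
mmlu = [v["acc_norm"] for k, v in results.items() if "hendrycksTest" in k]
average = sum(mmlu) / len(mmlu)
print(round(average, 4))
```

The full 57-subject average over all `hendrycksTest` configs would be computed the same way.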
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_voidful__changpt-bart | 2023-09-17T22:50:16.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of voidful/changpt-bart
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [voidful/changpt-bart](https://huggingface.co/voidful/changpt-bart) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_voidful__changpt-bart\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T22:50:05.507806](https://huggingface.co/datasets/open-llm-leaderboard/details_voidful__changpt-bart/blob/main/results_2023-09-17T22-50-05.507806.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"\
em_stderr\": 0.0,\n \"f1\": 0.0,\n \"f1_stderr\": 0.0,\n \"\
acc\": 0.2474348855564325,\n \"acc_stderr\": 0.007025872980895256\n },\n\
\ \"harness|drop|3\": {\n \"em\": 0.0,\n \"em_stderr\": 0.0,\n\
\ \"f1\": 0.0,\n \"f1_stderr\": 0.0\n },\n \"harness|gsm8k|5\"\
: {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.494869771112865,\n \"acc_stderr\": 0.014051745961790513\n\
\ }\n}\n```"
repo_url: https://huggingface.co/voidful/changpt-bart
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|arc:challenge|25_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|arc:challenge|25_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T22_50_05.507806
path:
- '**/details_harness|drop|3_2023-09-17T22-50-05.507806.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T22-50-05.507806.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T22_50_05.507806
path:
- '**/details_harness|gsm8k|5_2023-09-17T22-50-05.507806.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T22-50-05.507806.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hellaswag|10_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hellaswag|10_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T22_50_05.507806
path:
- '**/details_harness|winogrande|5_2023-09-17T22-50-05.507806.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T22-50-05.507806.parquet'
- config_name: results
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- results_2023-08-09T19:52:50.972620.parquet
- split: 2023_08_09T19_53_13.918423
path:
- results_2023-08-09T19:53:13.918423.parquet
- split: 2023_09_17T22_50_05.507806
path:
- results_2023-09-17T22-50-05.507806.parquet
- split: latest
path:
- results_2023-09-17T22-50-05.507806.parquet
---
# Dataset Card for Evaluation run of voidful/changpt-bart
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/voidful/changpt-bart
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [voidful/changpt-bart](https://huggingface.co/voidful/changpt-bart) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_voidful__changpt-bart",
"harness_winogrande_5",
split="train")
```
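As an aside on the naming scheme used throughout the `configs` listing above: each split name is simply the run timestamp with `-` and `:` replaced by `_` (a minimal sketch of that convention, not an official helper):

```python
def timestamp_to_split_name(timestamp: str) -> str:
    """Map a run timestamp (as it appears in the result filenames)
    to the corresponding dataset split name."""
    return timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split_name("2023-09-17T22:50:05.507806"))
# -> 2023_09_17T22_50_05.507806
```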
## Latest results
These are the [latest results from run 2023-09-17T22:50:05.507806](https://huggingface.co/datasets/open-llm-leaderboard/details_voidful__changpt-bart/blob/main/results_2023-09-17T22-50-05.507806.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.0,
"f1_stderr": 0.0,
"acc": 0.2474348855564325,
"acc_stderr": 0.007025872980895256
},
"harness|drop|3": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.0,
"f1_stderr": 0.0
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.494869771112865,
"acc_stderr": 0.014051745961790513
}
}
```
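If you only need the headline numbers out of a payload like the one above, the per-task accuracies can be pulled from the dict directly (a quick sketch over this example's structure; the field names are taken from the JSON above):

```python
results = {
    "all": {"em": 0.0, "em_stderr": 0.0, "f1": 0.0, "f1_stderr": 0.0,
            "acc": 0.2474348855564325, "acc_stderr": 0.007025872980895256},
    "harness|drop|3": {"em": 0.0, "em_stderr": 0.0, "f1": 0.0, "f1_stderr": 0.0},
    "harness|gsm8k|5": {"acc": 0.0, "acc_stderr": 0.0},
    "harness|winogrande|5": {"acc": 0.494869771112865,
                             "acc_stderr": 0.014051745961790513},
}

# Collect accuracy per task, skipping the "all" aggregate and tasks
# (like drop) that only report em/f1.
task_accs = {task: metrics["acc"]
             for task, metrics in results.items()
             if task != "all" and "acc" in metrics}
print(task_accs)
```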
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
---
pretty_name: Evaluation run of eachadea/vicuna-13b-1.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [eachadea/vicuna-13b-1.1](https://huggingface.co/eachadea/vicuna-13b-1.1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_eachadea__vicuna-13b-1.1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-19T18:54:56.836268](https://huggingface.co/datasets/open-llm-leaderboard/details_eachadea__vicuna-13b-1.1/blob/main/results_2023-07-19T18%3A54%3A56.836268.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5207458541981249,\n\
\ \"acc_stderr\": 0.03494058387309796,\n \"acc_norm\": 0.5242752921426072,\n\
\ \"acc_norm_stderr\": 0.03492505643523372,\n \"mc1\": 0.3623011015911873,\n\
\ \"mc1_stderr\": 0.016826646897262255,\n \"mc2\": 0.5207836984948891,\n\
\ \"mc2_stderr\": 0.01580678689190342\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5196245733788396,\n \"acc_stderr\": 0.014600132075947094,\n\
\ \"acc_norm\": 0.5273037542662116,\n \"acc_norm_stderr\": 0.014589589101985996\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6007767377016531,\n\
\ \"acc_stderr\": 0.004887378682406532,\n \"acc_norm\": 0.8013343955387373,\n\
\ \"acc_norm_stderr\": 0.003981802822377587\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.506578947368421,\n \"acc_stderr\": 0.040685900502249704,\n\
\ \"acc_norm\": 0.506578947368421,\n \"acc_norm_stderr\": 0.040685900502249704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4981132075471698,\n \"acc_stderr\": 0.030772653642075664,\n\
\ \"acc_norm\": 0.4981132075471698,\n \"acc_norm_stderr\": 0.030772653642075664\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n\
\ \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.5972222222222222,\n\
\ \"acc_norm_stderr\": 0.04101405519842426\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4161849710982659,\n\
\ \"acc_stderr\": 0.03758517775404948,\n \"acc_norm\": 0.4161849710982659,\n\
\ \"acc_norm_stderr\": 0.03758517775404948\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793254,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793254\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.031967586978353627,\n\
\ \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.031967586978353627\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3386243386243386,\n \"acc_stderr\": 0.02437319786798306,\n \"\
acc_norm\": 0.3386243386243386,\n \"acc_norm_stderr\": 0.02437319786798306\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.567741935483871,\n \"acc_stderr\": 0.028181739720019416,\n \"\
acc_norm\": 0.567741935483871,\n \"acc_norm_stderr\": 0.028181739720019416\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4039408866995074,\n \"acc_stderr\": 0.03452453903822039,\n \"\
acc_norm\": 0.4039408866995074,\n \"acc_norm_stderr\": 0.03452453903822039\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0368105086916155,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0368105086916155\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6565656565656566,\n \"acc_stderr\": 0.03383201223244441,\n \"\
acc_norm\": 0.6565656565656566,\n \"acc_norm_stderr\": 0.03383201223244441\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.694300518134715,\n \"acc_stderr\": 0.033248379397581594,\n\
\ \"acc_norm\": 0.694300518134715,\n \"acc_norm_stderr\": 0.033248379397581594\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4717948717948718,\n \"acc_stderr\": 0.0253106392549339,\n \
\ \"acc_norm\": 0.4717948717948718,\n \"acc_norm_stderr\": 0.0253106392549339\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228416,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.4495798319327731,\n \"acc_stderr\": 0.03231293497137707,\n\
\ \"acc_norm\": 0.4495798319327731,\n \"acc_norm_stderr\": 0.03231293497137707\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6862385321100918,\n \"acc_stderr\": 0.019894723341469116,\n \"\
acc_norm\": 0.6862385321100918,\n \"acc_norm_stderr\": 0.019894723341469116\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.03293377139415191,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.03293377139415191\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6862745098039216,\n \"acc_stderr\": 0.03256685484460388,\n \"\
acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.03256685484460388\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7088607594936709,\n \"acc_stderr\": 0.02957160106575337,\n \
\ \"acc_norm\": 0.7088607594936709,\n \"acc_norm_stderr\": 0.02957160106575337\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5874439461883408,\n\
\ \"acc_stderr\": 0.03304062175449297,\n \"acc_norm\": 0.5874439461883408,\n\
\ \"acc_norm_stderr\": 0.03304062175449297\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.04118438565806298,\n\
\ \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.04118438565806298\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6859504132231405,\n \"acc_stderr\": 0.042369647530410184,\n \"\
acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.042369647530410184\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6203703703703703,\n\
\ \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.6203703703703703,\n\
\ \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503947,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503947\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7521367521367521,\n\
\ \"acc_stderr\": 0.028286324075564386,\n \"acc_norm\": 0.7521367521367521,\n\
\ \"acc_norm_stderr\": 0.028286324075564386\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6922094508301405,\n\
\ \"acc_stderr\": 0.016506045045155637,\n \"acc_norm\": 0.6922094508301405,\n\
\ \"acc_norm_stderr\": 0.016506045045155637\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.546242774566474,\n \"acc_stderr\": 0.026803720583206177,\n\
\ \"acc_norm\": 0.546242774566474,\n \"acc_norm_stderr\": 0.026803720583206177\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3307262569832402,\n\
\ \"acc_stderr\": 0.01573502625896612,\n \"acc_norm\": 0.3307262569832402,\n\
\ \"acc_norm_stderr\": 0.01573502625896612\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.02845263998508801,\n\
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.02845263998508801\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5241157556270096,\n\
\ \"acc_stderr\": 0.028365041542564577,\n \"acc_norm\": 0.5241157556270096,\n\
\ \"acc_norm_stderr\": 0.028365041542564577\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.0277012284685426,\n\
\ \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.0277012284685426\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.029144544781596154,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.029144544781596154\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4165580182529335,\n\
\ \"acc_stderr\": 0.012591153245057383,\n \"acc_norm\": 0.4165580182529335,\n\
\ \"acc_norm_stderr\": 0.012591153245057383\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.030365446477275675,\n\
\ \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.030365446477275675\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5212418300653595,\n \"acc_stderr\": 0.020209572388600248,\n \
\ \"acc_norm\": 0.5212418300653595,\n \"acc_norm_stderr\": 0.020209572388600248\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5545454545454546,\n\
\ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.5545454545454546,\n\
\ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n\
\ \"acc_stderr\": 0.029705284056772436,\n \"acc_norm\": 0.7711442786069652,\n\
\ \"acc_norm_stderr\": 0.029705284056772436\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909281\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.03467826685703826,\n\
\ \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.03467826685703826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3623011015911873,\n\
\ \"mc1_stderr\": 0.016826646897262255,\n \"mc2\": 0.5207836984948891,\n\
\ \"mc2_stderr\": 0.01580678689190342\n }\n}\n```"
repo_url: https://huggingface.co/eachadea/vicuna-13b-1.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:54:56.836268.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:54:56.836268.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:54:56.836268.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:54:56.836268.parquet'
- config_name: results
data_files:
- split: 2023_07_19T18_54_56.836268
path:
- results_2023-07-19T18:54:56.836268.parquet
- split: latest
path:
- results_2023-07-19T18:54:56.836268.parquet
---
# Dataset Card for Evaluation run of eachadea/vicuna-13b-1.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/eachadea/vicuna-13b-1.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [eachadea/vicuna-13b-1.1](https://huggingface.co/eachadea/vicuna-13b-1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_eachadea__vicuna-13b-1.1",
"harness_truthfulqa_mc_0",
split="train")
```
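The per-task configuration names listed in the YAML metadata follow a mechanical pattern: the harness task name with `:` and `-` replaced by `_`, prefixed with `harness_` and suffixed with the few-shot count. As a sketch (this helper is hypothetical, for illustration only, and not part of the `datasets` library), the mapping can be written as:

```python
def config_name(task: str, num_fewshot: int) -> str:
    """Map a harness task name to this dataset's config name.

    Hypothetical helper for illustration; e.g. "arc:challenge" with
    25 few-shot examples maps to "harness_arc_challenge_25".
    """
    # Replace the separators used in harness task names with underscores.
    normalized = task.replace(":", "_").replace("-", "_")
    return f"harness_{normalized}_{num_fewshot}"
```

For example, `config_name("hendrycksTest-world_religions", 5)` yields `"harness_hendrycksTest_world_religions_5"`, which can be passed as the second argument to `load_dataset` above.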
## Latest results
These are the [latest results from run 2023-07-19T18:54:56.836268](https://huggingface.co/datasets/open-llm-leaderboard/details_eachadea__vicuna-13b-1.1/blob/main/results_2023-07-19T18%3A54%3A56.836268.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5207458541981249,
"acc_stderr": 0.03494058387309796,
"acc_norm": 0.5242752921426072,
"acc_norm_stderr": 0.03492505643523372,
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262255,
"mc2": 0.5207836984948891,
"mc2_stderr": 0.01580678689190342
},
"harness|arc:challenge|25": {
"acc": 0.5196245733788396,
"acc_stderr": 0.014600132075947094,
"acc_norm": 0.5273037542662116,
"acc_norm_stderr": 0.014589589101985996
},
"harness|hellaswag|10": {
"acc": 0.6007767377016531,
"acc_stderr": 0.004887378682406532,
"acc_norm": 0.8013343955387373,
"acc_norm_stderr": 0.003981802822377587
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.506578947368421,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.506578947368421,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4981132075471698,
"acc_stderr": 0.030772653642075664,
"acc_norm": 0.4981132075471698,
"acc_norm_stderr": 0.030772653642075664
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.04101405519842426,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.04101405519842426
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4161849710982659,
"acc_stderr": 0.03758517775404948,
"acc_norm": 0.4161849710982659,
"acc_norm_stderr": 0.03758517775404948
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.044405219061793254,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.044405219061793254
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.031967586978353627,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.031967586978353627
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3386243386243386,
"acc_stderr": 0.02437319786798306,
"acc_norm": 0.3386243386243386,
"acc_norm_stderr": 0.02437319786798306
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.567741935483871,
"acc_stderr": 0.028181739720019416,
"acc_norm": 0.567741935483871,
"acc_norm_stderr": 0.028181739720019416
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4039408866995074,
"acc_stderr": 0.03452453903822039,
"acc_norm": 0.4039408866995074,
"acc_norm_stderr": 0.03452453903822039
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0368105086916155,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0368105086916155
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6565656565656566,
"acc_stderr": 0.03383201223244441,
"acc_norm": 0.6565656565656566,
"acc_norm_stderr": 0.03383201223244441
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.694300518134715,
"acc_stderr": 0.033248379397581594,
"acc_norm": 0.694300518134715,
"acc_norm_stderr": 0.033248379397581594
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4717948717948718,
"acc_stderr": 0.0253106392549339,
"acc_norm": 0.4717948717948718,
"acc_norm_stderr": 0.0253106392549339
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228416,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4495798319327731,
"acc_stderr": 0.03231293497137707,
"acc_norm": 0.4495798319327731,
"acc_norm_stderr": 0.03231293497137707
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6862385321100918,
"acc_stderr": 0.019894723341469116,
"acc_norm": 0.6862385321100918,
"acc_norm_stderr": 0.019894723341469116
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.03293377139415191,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.03293377139415191
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.03256685484460388,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.03256685484460388
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7088607594936709,
"acc_stderr": 0.02957160106575337,
"acc_norm": 0.7088607594936709,
"acc_norm_stderr": 0.02957160106575337
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5874439461883408,
"acc_stderr": 0.03304062175449297,
"acc_norm": 0.5874439461883408,
"acc_norm_stderr": 0.03304062175449297
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.04118438565806298,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.04118438565806298
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.042369647530410184,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.042369647530410184
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6380368098159509,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.6380368098159509,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.04541609446503947,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.04541609446503947
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7521367521367521,
"acc_stderr": 0.028286324075564386,
"acc_norm": 0.7521367521367521,
"acc_norm_stderr": 0.028286324075564386
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6922094508301405,
"acc_stderr": 0.016506045045155637,
"acc_norm": 0.6922094508301405,
"acc_norm_stderr": 0.016506045045155637
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.546242774566474,
"acc_stderr": 0.026803720583206177,
"acc_norm": 0.546242774566474,
"acc_norm_stderr": 0.026803720583206177
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3307262569832402,
"acc_stderr": 0.01573502625896612,
"acc_norm": 0.3307262569832402,
"acc_norm_stderr": 0.01573502625896612
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.02845263998508801,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.02845263998508801
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5241157556270096,
"acc_stderr": 0.028365041542564577,
"acc_norm": 0.5241157556270096,
"acc_norm_stderr": 0.028365041542564577
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.0277012284685426,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.0277012284685426
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.029144544781596154,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.029144544781596154
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4165580182529335,
"acc_stderr": 0.012591153245057383,
"acc_norm": 0.4165580182529335,
"acc_norm_stderr": 0.012591153245057383
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4889705882352941,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.4889705882352941,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5212418300653595,
"acc_stderr": 0.020209572388600248,
"acc_norm": 0.5212418300653595,
"acc_norm_stderr": 0.020209572388600248
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5545454545454546,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.5545454545454546,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7711442786069652,
"acc_stderr": 0.029705284056772436,
"acc_norm": 0.7711442786069652,
"acc_norm_stderr": 0.029705284056772436
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262255,
"mc2": 0.5207836984948891,
"mc2_stderr": 0.01580678689190342
}
}
```
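The per-task entries above are plain key/value pairs, so they can be post-processed with standard Python once the JSON is loaded. As a minimal sketch, here is how a subset of the MMLU accuracies could be ranked; the values are copied verbatim from the results block above, and the choice of tasks is purely illustrative:

```python
# Rank a hand-copied subset of the per-task accuracies from the results JSON.
# Task names and values are taken verbatim from the card; the subset is illustrative.
results = {
    "harness|hendrycksTest-abstract_algebra|5": 0.28,
    "harness|hendrycksTest-marketing|5": 0.7521367521367521,
    "harness|hendrycksTest-sociology|5": 0.7711442786069652,
    "harness|hendrycksTest-us_foreign_policy|5": 0.76,
}

# Sort tasks from highest to lowest accuracy.
ranked = sorted(results.items(), key=lambda kv: kv[1], reverse=True)
for task, acc in ranked:
    print(f"{task}: {acc:.4f}")
```

The same pattern applies to the full results file: load the JSON, skip the `"all"` aggregate entry, and sort the remaining task entries by the metric of interest.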
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_eachadea__vicuna-7b-1.1 | 2023-09-22T23:37:24.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of eachadea/vicuna-7b-1.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [eachadea/vicuna-7b-1.1](https://huggingface.co/eachadea/vicuna-7b-1.1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one\
\ of the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_eachadea__vicuna-7b-1.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T23:37:12.210643](https://huggingface.co/datasets/open-llm-leaderboard/details_eachadea__vicuna-7b-1.1/blob/main/results_2023-09-22T23-37-12.210643.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.11388422818791946,\n\
\ \"em_stderr\": 0.00325324428862373,\n \"f1\": 0.16976719798657605,\n\
\ \"f1_stderr\": 0.003380156230610554,\n \"acc\": 0.38244753834582057,\n\
\ \"acc_stderr\": 0.009528517622122097\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.11388422818791946,\n \"em_stderr\": 0.00325324428862373,\n\
\ \"f1\": 0.16976719798657605,\n \"f1_stderr\": 0.003380156230610554\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05534495830174375,\n \
\ \"acc_stderr\": 0.006298221796179588\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7095501183898973,\n \"acc_stderr\": 0.012758813448064607\n\
\ }\n}\n```"
repo_url: https://huggingface.co/eachadea/vicuna-7b-1.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|arc:challenge|25_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T23_37_12.210643
path:
- '**/details_harness|drop|3_2023-09-22T23-37-12.210643.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T23-37-12.210643.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T23_37_12.210643
path:
- '**/details_harness|gsm8k|5_2023-09-22T23-37-12.210643.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T23-37-12.210643.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hellaswag|10_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T12:22:46.451039.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T12:22:46.451039.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T12:22:46.451039.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T23_37_12.210643
path:
- '**/details_harness|winogrande|5_2023-09-22T23-37-12.210643.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T23-37-12.210643.parquet'
- config_name: results
data_files:
- split: 2023_07_18T12_22_46.451039
path:
- results_2023-07-18T12:22:46.451039.parquet
- split: 2023_09_22T23_37_12.210643
path:
- results_2023-09-22T23-37-12.210643.parquet
- split: latest
path:
- results_2023-09-22T23-37-12.210643.parquet
---
# Dataset Card for Evaluation run of eachadea/vicuna-7b-1.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/eachadea/vicuna-7b-1.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [eachadea/vicuna-7b-1.1](https://huggingface.co/eachadea/vicuna-7b-1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
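As can be inferred from the split names listed in this repo's configuration (an observation from the listed splits, not a documented API), a run's ISO timestamp becomes a split name by replacing the date and time separators with underscores; a minimal sketch:

```python
def run_timestamp_to_split(run_timestamp: str) -> str:
    """Map an ISO run timestamp to the split name used in this repo.

    Inferred from the listed splits (an assumption, not an official API):
    "2023-07-18T12:22:46.451039" -> "2023_07_18T12_22_46.451039".
    """
    return run_timestamp.replace("-", "_").replace(":", "_")


print(run_timestamp_to_split("2023-07-18T12:22:46.451039"))
# 2023_07_18T12_22_46.451039
```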
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_eachadea__vicuna-7b-1.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-22T23:37:12.210643](https://huggingface.co/datasets/open-llm-leaderboard/details_eachadea__vicuna-7b-1.1/blob/main/results_2023-09-22T23-37-12.210643.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.11388422818791946,
"em_stderr": 0.00325324428862373,
"f1": 0.16976719798657605,
"f1_stderr": 0.003380156230610554,
"acc": 0.38244753834582057,
"acc_stderr": 0.009528517622122097
},
"harness|drop|3": {
"em": 0.11388422818791946,
"em_stderr": 0.00325324428862373,
"f1": 0.16976719798657605,
"f1_stderr": 0.003380156230610554
},
"harness|gsm8k|5": {
"acc": 0.05534495830174375,
"acc_stderr": 0.006298221796179588
},
"harness|winogrande|5": {
"acc": 0.7095501183898973,
"acc_stderr": 0.012758813448064607
}
}
```
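For the metrics above, the top-level "all" accuracy appears to be the unweighted mean of the per-task accuracies (an observation from the numbers shown, not a documented guarantee); a quick check:

```python
# Per-task accuracies reported above (harness|gsm8k|5 and harness|winogrande|5).
gsm8k_acc = 0.05534495830174375
winogrande_acc = 0.7095501183898973

# The "all" accuracy matches their unweighted mean to floating-point precision.
mean_acc = (gsm8k_acc + winogrande_acc) / 2
print(mean_acc)  # ~0.38244753834582057
```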
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_eachadea__vicuna-13b | 2023-08-27T12:38:34.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of eachadea/vicuna-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [eachadea/vicuna-13b](https://huggingface.co/eachadea/vicuna-13b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_eachadea__vicuna-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-18T14:25:52.300291](https://huggingface.co/datasets/open-llm-leaderboard/details_eachadea__vicuna-13b/blob/main/results_2023-07-18T14%3A25%3A52.300291.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5098692105322622,\n\
\ \"acc_stderr\": 0.03500619771332418,\n \"acc_norm\": 0.5134393492866836,\n\
\ \"acc_norm_stderr\": 0.03499106655706028,\n \"mc1\": 0.36107711138310894,\n\
\ \"mc1_stderr\": 0.016814312844836886,\n \"mc2\": 0.526792499290547,\n\
\ \"mc2_stderr\": 0.015728661776034537\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5017064846416383,\n \"acc_stderr\": 0.014611305705056987,\n\
\ \"acc_norm\": 0.5170648464163823,\n \"acc_norm_stderr\": 0.014602878388536597\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6041625174268074,\n\
\ \"acc_stderr\": 0.004880303863138503,\n \"acc_norm\": 0.7994423421629158,\n\
\ \"acc_norm_stderr\": 0.003995992960088771\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874142,\n\
\ \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874142\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5018867924528302,\n \"acc_stderr\": 0.03077265364207567,\n\
\ \"acc_norm\": 0.5018867924528302,\n \"acc_norm_stderr\": 0.03077265364207567\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n\
\ \"acc_stderr\": 0.041614023984032786,\n \"acc_norm\": 0.5486111111111112,\n\
\ \"acc_norm_stderr\": 0.041614023984032786\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4161849710982659,\n\
\ \"acc_stderr\": 0.03758517775404948,\n \"acc_norm\": 0.4161849710982659,\n\
\ \"acc_norm_stderr\": 0.03758517775404948\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3872340425531915,\n \"acc_stderr\": 0.03184389265339525,\n\
\ \"acc_norm\": 0.3872340425531915,\n \"acc_norm_stderr\": 0.03184389265339525\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30952380952380953,\n \"acc_stderr\": 0.02380952380952387,\n \"\
acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.02380952380952387\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5612903225806452,\n \"acc_stderr\": 0.028229497320317216,\n \"\
acc_norm\": 0.5612903225806452,\n \"acc_norm_stderr\": 0.028229497320317216\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3645320197044335,\n \"acc_stderr\": 0.0338640574606209,\n \"acc_norm\"\
: 0.3645320197044335,\n \"acc_norm_stderr\": 0.0338640574606209\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031595,\n\
\ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031595\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6262626262626263,\n \"acc_stderr\": 0.03446897738659333,\n \"\
acc_norm\": 0.6262626262626263,\n \"acc_norm_stderr\": 0.03446897738659333\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6994818652849741,\n \"acc_stderr\": 0.033088185944157494,\n\
\ \"acc_norm\": 0.6994818652849741,\n \"acc_norm_stderr\": 0.033088185944157494\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4641025641025641,\n \"acc_stderr\": 0.025285585990017848,\n\
\ \"acc_norm\": 0.4641025641025641,\n \"acc_norm_stderr\": 0.025285585990017848\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.02708037281514565,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.02708037281514565\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.0324371805513741,\n \
\ \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.0324371805513741\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6844036697247706,\n \"acc_stderr\": 0.019926117513869666,\n \"\
acc_norm\": 0.6844036697247706,\n \"acc_norm_stderr\": 0.019926117513869666\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3611111111111111,\n \"acc_stderr\": 0.03275773486100999,\n \"\
acc_norm\": 0.3611111111111111,\n \"acc_norm_stderr\": 0.03275773486100999\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6617647058823529,\n \"acc_stderr\": 0.03320574612945431,\n \"\
acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.03320574612945431\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.70042194092827,\n \"acc_stderr\": 0.02981802474975309,\n \
\ \"acc_norm\": 0.70042194092827,\n \"acc_norm_stderr\": 0.02981802474975309\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5829596412556054,\n\
\ \"acc_stderr\": 0.03309266936071721,\n \"acc_norm\": 0.5829596412556054,\n\
\ \"acc_norm_stderr\": 0.03309266936071721\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6203703703703703,\n\
\ \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.6203703703703703,\n\
\ \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6257668711656442,\n \"acc_stderr\": 0.03802068102899615,\n\
\ \"acc_norm\": 0.6257668711656442,\n \"acc_norm_stderr\": 0.03802068102899615\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.045218299028335865,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.045218299028335865\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503947,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503947\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.027236013946196687,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.027236013946196687\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.685823754789272,\n\
\ \"acc_stderr\": 0.0165992917358849,\n \"acc_norm\": 0.685823754789272,\n\
\ \"acc_norm_stderr\": 0.0165992917358849\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5635838150289018,\n \"acc_stderr\": 0.02670054542494368,\n\
\ \"acc_norm\": 0.5635838150289018,\n \"acc_norm_stderr\": 0.02670054542494368\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27039106145251396,\n\
\ \"acc_stderr\": 0.014854993938010068,\n \"acc_norm\": 0.27039106145251396,\n\
\ \"acc_norm_stderr\": 0.014854993938010068\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.028431095444176643,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.028431095444176643\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5434083601286174,\n\
\ \"acc_stderr\": 0.028290869054197608,\n \"acc_norm\": 0.5434083601286174,\n\
\ \"acc_norm_stderr\": 0.028290869054197608\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.027777777777777797,\n\
\ \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.027777777777777797\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.375886524822695,\n \"acc_stderr\": 0.028893955412115882,\n \
\ \"acc_norm\": 0.375886524822695,\n \"acc_norm_stderr\": 0.028893955412115882\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3917861799217731,\n\
\ \"acc_stderr\": 0.012467564418145135,\n \"acc_norm\": 0.3917861799217731,\n\
\ \"acc_norm_stderr\": 0.012467564418145135\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03032024326500413,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03032024326500413\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5163398692810458,\n \"acc_stderr\": 0.020217030653186467,\n \
\ \"acc_norm\": 0.5163398692810458,\n \"acc_norm_stderr\": 0.020217030653186467\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6040816326530613,\n \"acc_stderr\": 0.03130802899065686,\n\
\ \"acc_norm\": 0.6040816326530613,\n \"acc_norm_stderr\": 0.03130802899065686\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\
\ \"acc_stderr\": 0.03187187537919796,\n \"acc_norm\": 0.7164179104477612,\n\
\ \"acc_norm_stderr\": 0.03187187537919796\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.038743715565879536,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.038743715565879536\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.03446296217088427,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.03446296217088427\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36107711138310894,\n\
\ \"mc1_stderr\": 0.016814312844836886,\n \"mc2\": 0.526792499290547,\n\
\ \"mc2_stderr\": 0.015728661776034537\n }\n}\n```"
repo_url: https://huggingface.co/eachadea/vicuna-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|arc:challenge|25_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hellaswag|10_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T14:25:52.300291.parquet'
- config_name: results
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- results_2023-07-18T14:25:52.300291.parquet
- split: latest
path:
- results_2023-07-18T14:25:52.300291.parquet
---
# Dataset Card for Evaluation run of eachadea/vicuna-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/eachadea/vicuna-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [eachadea/vicuna-13b](https://huggingface.co/eachadea/vicuna-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_eachadea__vicuna-13b",
"harness_truthfulqa_mc_0",
split="train")
```
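Each configuration name listed in the YAML metadata follows directly from the corresponding harness task id: `|`, `:`, and `-` are all mapped to `_`. A minimal sketch of that mapping (the `task_to_config_name` helper is hypothetical, shown only for illustration):

```python
def task_to_config_name(task: str) -> str:
    """Map a harness task id (e.g. "harness|truthfulqa:mc|0")
    to its dataset config name (e.g. "harness_truthfulqa_mc_0")."""
    for ch in "|:-":
        task = task.replace(ch, "_")
    return task

# The resulting name can then be passed to load_dataset, for instance:
# from datasets import load_dataset
# data = load_dataset(
#     "open-llm-leaderboard/details_eachadea__vicuna-13b",
#     task_to_config_name("harness|hendrycksTest-anatomy|5"),
#     split="latest",  # or a timestamped split like "2023_07_18T14_25_52.300291"
# )
```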
## Latest results
These are the [latest results from run 2023-07-18T14:25:52.300291](https://huggingface.co/datasets/open-llm-leaderboard/details_eachadea__vicuna-13b/blob/main/results_2023-07-18T14%3A25%3A52.300291.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5098692105322622,
"acc_stderr": 0.03500619771332418,
"acc_norm": 0.5134393492866836,
"acc_norm_stderr": 0.03499106655706028,
"mc1": 0.36107711138310894,
"mc1_stderr": 0.016814312844836886,
"mc2": 0.526792499290547,
"mc2_stderr": 0.015728661776034537
},
"harness|arc:challenge|25": {
"acc": 0.5017064846416383,
"acc_stderr": 0.014611305705056987,
"acc_norm": 0.5170648464163823,
"acc_norm_stderr": 0.014602878388536597
},
"harness|hellaswag|10": {
"acc": 0.6041625174268074,
"acc_stderr": 0.004880303863138503,
"acc_norm": 0.7994423421629158,
"acc_norm_stderr": 0.003995992960088771
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874142,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874142
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5018867924528302,
"acc_stderr": 0.03077265364207567,
"acc_norm": 0.5018867924528302,
"acc_norm_stderr": 0.03077265364207567
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5486111111111112,
"acc_stderr": 0.041614023984032786,
"acc_norm": 0.5486111111111112,
"acc_norm_stderr": 0.041614023984032786
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4161849710982659,
"acc_stderr": 0.03758517775404948,
"acc_norm": 0.4161849710982659,
"acc_norm_stderr": 0.03758517775404948
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3872340425531915,
"acc_stderr": 0.03184389265339525,
"acc_norm": 0.3872340425531915,
"acc_norm_stderr": 0.03184389265339525
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.02380952380952387,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.02380952380952387
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5612903225806452,
"acc_stderr": 0.028229497320317216,
"acc_norm": 0.5612903225806452,
"acc_norm_stderr": 0.028229497320317216
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.0338640574606209,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.0338640574606209
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.03697442205031595,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.03697442205031595
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6262626262626263,
"acc_stderr": 0.03446897738659333,
"acc_norm": 0.6262626262626263,
"acc_norm_stderr": 0.03446897738659333
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6994818652849741,
"acc_stderr": 0.033088185944157494,
"acc_norm": 0.6994818652849741,
"acc_norm_stderr": 0.033088185944157494
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4641025641025641,
"acc_stderr": 0.025285585990017848,
"acc_norm": 0.4641025641025641,
"acc_norm_stderr": 0.025285585990017848
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.02708037281514565,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.02708037281514565
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5252100840336135,
"acc_stderr": 0.0324371805513741,
"acc_norm": 0.5252100840336135,
"acc_norm_stderr": 0.0324371805513741
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6844036697247706,
"acc_stderr": 0.019926117513869666,
"acc_norm": 0.6844036697247706,
"acc_norm_stderr": 0.019926117513869666
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.03275773486100999,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.03275773486100999
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.03320574612945431,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.03320574612945431
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.70042194092827,
"acc_stderr": 0.02981802474975309,
"acc_norm": 0.70042194092827,
"acc_norm_stderr": 0.02981802474975309
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5829596412556054,
"acc_stderr": 0.03309266936071721,
"acc_norm": 0.5829596412556054,
"acc_norm_stderr": 0.03309266936071721
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6257668711656442,
"acc_stderr": 0.03802068102899615,
"acc_norm": 0.6257668711656442,
"acc_norm_stderr": 0.03802068102899615
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.045218299028335865,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.045218299028335865
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.04541609446503947,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.04541609446503947
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.027236013946196687,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.027236013946196687
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.685823754789272,
"acc_stderr": 0.0165992917358849,
"acc_norm": 0.685823754789272,
"acc_norm_stderr": 0.0165992917358849
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5635838150289018,
"acc_stderr": 0.02670054542494368,
"acc_norm": 0.5635838150289018,
"acc_norm_stderr": 0.02670054542494368
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27039106145251396,
"acc_stderr": 0.014854993938010068,
"acc_norm": 0.27039106145251396,
"acc_norm_stderr": 0.014854993938010068
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.028431095444176643,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.028431095444176643
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5434083601286174,
"acc_stderr": 0.028290869054197608,
"acc_norm": 0.5434083601286174,
"acc_norm_stderr": 0.028290869054197608
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.027777777777777797,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.027777777777777797
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.375886524822695,
"acc_stderr": 0.028893955412115882,
"acc_norm": 0.375886524822695,
"acc_norm_stderr": 0.028893955412115882
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3917861799217731,
"acc_stderr": 0.012467564418145135,
"acc_norm": 0.3917861799217731,
"acc_norm_stderr": 0.012467564418145135
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03032024326500413,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03032024326500413
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5163398692810458,
"acc_stderr": 0.020217030653186467,
"acc_norm": 0.5163398692810458,
"acc_norm_stderr": 0.020217030653186467
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6040816326530613,
"acc_stderr": 0.03130802899065686,
"acc_norm": 0.6040816326530613,
"acc_norm_stderr": 0.03130802899065686
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.03187187537919796,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.03187187537919796
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.038743715565879536,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.038743715565879536
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.03446296217088427,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.03446296217088427
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36107711138310894,
"mc1_stderr": 0.016814312844836886,
"mc2": 0.526792499290547,
"mc2_stderr": 0.015728661776034537
}
}
```
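The per-task entries above are plain nested JSON, so once a results file is downloaded they can be aggregated with a few lines of standard-library Python. The snippet below is a minimal sketch that macro-averages two `acc_norm` values copied from the JSON above (it is not the full leaderboard average, which covers every task):

```python
import json

# A small excerpt of the results JSON above, kept inline for illustration only.
results_excerpt = json.loads("""
{
    "harness|arc:challenge|25": {"acc_norm": 0.5170648464163823},
    "harness|hellaswag|10": {"acc_norm": 0.7994423421629158}
}
""")

# Macro-average acc_norm over the tasks present in the excerpt.
scores = [task["acc_norm"] for task in results_excerpt.values()]
macro_avg = sum(scores) / len(scores)
print(round(macro_avg, 4))  # 0.6583
```

The same pattern applies to any subset of the `harness|...` keys in the full results file.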
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_ashercn97__giraffe-7b | 2023-09-22T20:53:58.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of ashercn97/giraffe-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ashercn97/giraffe-7b](https://huggingface.co/ashercn97/giraffe-7b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ashercn97__giraffe-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T20:53:47.065964](https://huggingface.co/datasets/open-llm-leaderboard/details_ashercn97__giraffe-7b/blob/main/results_2023-09-22T20-53-47.065964.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.00388003355704698,\n\
\ \"em_stderr\": 0.0006366682825520032,\n \"f1\": 0.06388317953020159,\n\
\ \"f1_stderr\": 0.0014760537495948263,\n \"acc\": 0.3581768614021409,\n\
\ \"acc_stderr\": 0.008713750066062537\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.00388003355704698,\n \"em_stderr\": 0.0006366682825520032,\n\
\ \"f1\": 0.06388317953020159,\n \"f1_stderr\": 0.0014760537495948263\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.026535253980288095,\n \
\ \"acc_stderr\": 0.004427045987265172\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6898184688239937,\n \"acc_stderr\": 0.013000454144859902\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ashercn97/giraffe-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|arc:challenge|25_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T20_53_47.065964
path:
- '**/details_harness|drop|3_2023-09-22T20-53-47.065964.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T20-53-47.065964.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T20_53_47.065964
path:
- '**/details_harness|gsm8k|5_2023-09-22T20-53-47.065964.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T20-53-47.065964.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hellaswag|10_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T20_53_47.065964
path:
- '**/details_harness|winogrande|5_2023-09-22T20-53-47.065964.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T20-53-47.065964.parquet'
- config_name: results
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- results_2023-08-02T15:44:19.746565.parquet
- split: 2023_09_22T20_53_47.065964
path:
- results_2023-09-22T20-53-47.065964.parquet
- split: latest
path:
- results_2023-09-22T20-53-47.065964.parquet
---
# Dataset Card for Evaluation run of ashercn97/giraffe-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ashercn97/giraffe-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ashercn97/giraffe-7b](https://huggingface.co/ashercn97/giraffe-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ashercn97__giraffe-7b",
"harness_winogrande_5",
split="train")
```
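The split name passed to `load_dataset` above is derived from the run timestamp: hyphens and colons become underscores while the fractional-second dot is kept, as the split names in this card's config show. A minimal sketch of that mapping (the helper name is ours, not part of the leaderboard tooling):

```python
def timestamp_to_split(ts: str) -> str:
    # "2023-09-22T20:53:47.065964" -> "2023_09_22T20_53_47.065964":
    # replace date hyphens and time colons with underscores; keep the dot.
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-09-22T20:53:47.065964"))
# 2023_09_22T20_53_47.065964
```

This is handy for turning a results-file timestamp into the matching split name without listing the repository first.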
## Latest results
These are the [latest results from run 2023-09-22T20:53:47.065964](https://huggingface.co/datasets/open-llm-leaderboard/details_ashercn97__giraffe-7b/blob/main/results_2023-09-22T20-53-47.065964.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.00388003355704698,
"em_stderr": 0.0006366682825520032,
"f1": 0.06388317953020159,
"f1_stderr": 0.0014760537495948263,
"acc": 0.3581768614021409,
"acc_stderr": 0.008713750066062537
},
"harness|drop|3": {
"em": 0.00388003355704698,
"em_stderr": 0.0006366682825520032,
"f1": 0.06388317953020159,
"f1_stderr": 0.0014760537495948263
},
"harness|gsm8k|5": {
"acc": 0.026535253980288095,
"acc_stderr": 0.004427045987265172
},
"harness|winogrande|5": {
"acc": 0.6898184688239937,
"acc_stderr": 0.013000454144859902
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-abstract_algebra-neg-prepend-fix | 2023-08-21T07:31:10.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 4909
num_examples: 5
- name: test
num_bytes: 196242
num_examples: 100
download_size: 11253
dataset_size: 201151
---
# Dataset Card for "mmlu-abstract_algebra-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-anatomy-neg-prepend-fix | 2023-08-21T07:31:23.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 4622
num_examples: 5
- name: test
num_bytes: 277961
num_examples: 135
download_size: 11502
dataset_size: 282583
---
# Dataset Card for "mmlu-anatomy-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_cerebras__Cerebras-GPT-1.3B | 2023-08-27T12:38:39.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of cerebras/Cerebras-GPT-1.3B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cerebras/Cerebras-GPT-1.3B](https://huggingface.co/cerebras/Cerebras-GPT-1.3B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cerebras__Cerebras-GPT-1.3B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-18T11:08:05.365000](https://huggingface.co/datasets/open-llm-leaderboard/details_cerebras__Cerebras-GPT-1.3B/blob/main/results_2023-07-18T11%3A08%3A05.365000.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26651172140766793,\n\
\ \"acc_stderr\": 0.03193977477898276,\n \"acc_norm\": 0.26790088316822264,\n\
\ \"acc_norm_stderr\": 0.03194994540006838,\n \"mc1\": 0.24479804161566707,\n\
\ \"mc1_stderr\": 0.01505186948671501,\n \"mc2\": 0.4269871364718513,\n\
\ \"mc2_stderr\": 0.014897248723095273\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.23720136518771331,\n \"acc_stderr\": 0.01243039982926084,\n\
\ \"acc_norm\": 0.2627986348122867,\n \"acc_norm_stderr\": 0.012862523175351333\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.32901812387970525,\n\
\ \"acc_stderr\": 0.004688963175758139,\n \"acc_norm\": 0.385381398127863,\n\
\ \"acc_norm_stderr\": 0.004856906473719379\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2074074074074074,\n\
\ \"acc_stderr\": 0.03502553170678316,\n \"acc_norm\": 0.2074074074074074,\n\
\ \"acc_norm_stderr\": 0.03502553170678316\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.033176727875331574,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.033176727875331574\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.02725726032249485,\n\
\ \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.02725726032249485\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.03586879280080343,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.03586879280080343\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2658959537572254,\n\
\ \"acc_stderr\": 0.03368762932259431,\n \"acc_norm\": 0.2658959537572254,\n\
\ \"acc_norm_stderr\": 0.03368762932259431\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2765957446808511,\n \"acc_stderr\": 0.029241883869628806,\n\
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.029241883869628806\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.040493392977481404,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.040493392977481404\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24074074074074073,\n \"acc_stderr\": 0.022019080012217893,\n \"\
acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.022019080012217893\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.039325376803928704,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.039325376803928704\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2032258064516129,\n\
\ \"acc_stderr\": 0.022891687984554963,\n \"acc_norm\": 0.2032258064516129,\n\
\ \"acc_norm_stderr\": 0.022891687984554963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.22660098522167488,\n \"acc_stderr\": 0.029454863835292965,\n\
\ \"acc_norm\": 0.22660098522167488,\n \"acc_norm_stderr\": 0.029454863835292965\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.03427743175816524,\n\
\ \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.03427743175816524\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.31313131313131315,\n \"acc_stderr\": 0.033042050878136525,\n \"\
acc_norm\": 0.31313131313131315,\n \"acc_norm_stderr\": 0.033042050878136525\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3471502590673575,\n \"acc_stderr\": 0.034356961683613546,\n\
\ \"acc_norm\": 0.3471502590673575,\n \"acc_norm_stderr\": 0.034356961683613546\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3230769230769231,\n \"acc_stderr\": 0.023710888501970562,\n\
\ \"acc_norm\": 0.3230769230769231,\n \"acc_norm_stderr\": 0.023710888501970562\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712166,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712166\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.027025433498882364,\n\
\ \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.027025433498882364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23841059602649006,\n \"acc_stderr\": 0.03479185572599659,\n \"\
acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.03479185572599659\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3431192660550459,\n \"acc_stderr\": 0.02035477773608604,\n \"\
acc_norm\": 0.3431192660550459,\n \"acc_norm_stderr\": 0.02035477773608604\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.22058823529411764,\n\
\ \"acc_stderr\": 0.02910225438967409,\n \"acc_norm\": 0.22058823529411764,\n\
\ \"acc_norm_stderr\": 0.02910225438967409\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598014,\n\
\ \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598014\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.23318385650224216,\n\
\ \"acc_stderr\": 0.028380391147094716,\n \"acc_norm\": 0.23318385650224216,\n\
\ \"acc_norm_stderr\": 0.028380391147094716\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.037276735755969174,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.037276735755969174\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2809917355371901,\n \"acc_stderr\": 0.04103203830514512,\n \"\
acc_norm\": 0.2809917355371901,\n \"acc_norm_stderr\": 0.04103203830514512\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.24074074074074073,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.039166677628225836,\n\
\ \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.039166677628225836\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2948717948717949,\n\
\ \"acc_stderr\": 0.029872577708891155,\n \"acc_norm\": 0.2948717948717949,\n\
\ \"acc_norm_stderr\": 0.029872577708891155\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24648786717752236,\n\
\ \"acc_stderr\": 0.015411308769686929,\n \"acc_norm\": 0.24648786717752236,\n\
\ \"acc_norm_stderr\": 0.015411308769686929\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2630057803468208,\n \"acc_stderr\": 0.023703099525258172,\n\
\ \"acc_norm\": 0.2630057803468208,\n \"acc_norm_stderr\": 0.023703099525258172\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23687150837988827,\n\
\ \"acc_stderr\": 0.014219570788103982,\n \"acc_norm\": 0.23687150837988827,\n\
\ \"acc_norm_stderr\": 0.014219570788103982\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.026090162504279042,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.026090162504279042\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26366559485530544,\n\
\ \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.26366559485530544,\n\
\ \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.02456922360046085,\n\
\ \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.02456922360046085\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2801418439716312,\n \"acc_stderr\": 0.02678917235114023,\n \
\ \"acc_norm\": 0.2801418439716312,\n \"acc_norm_stderr\": 0.02678917235114023\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2379400260756193,\n\
\ \"acc_stderr\": 0.010875700787694242,\n \"acc_norm\": 0.2379400260756193,\n\
\ \"acc_norm_stderr\": 0.010875700787694242\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.41911764705882354,\n \"acc_stderr\": 0.02997280717046462,\n\
\ \"acc_norm\": 0.41911764705882354,\n \"acc_norm_stderr\": 0.02997280717046462\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.018120224251484587,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.018120224251484587\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.16363636363636364,\n\
\ \"acc_stderr\": 0.03543433054298678,\n \"acc_norm\": 0.16363636363636364,\n\
\ \"acc_norm_stderr\": 0.03543433054298678\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.23673469387755103,\n \"acc_stderr\": 0.027212835884073142,\n\
\ \"acc_norm\": 0.23673469387755103,\n \"acc_norm_stderr\": 0.027212835884073142\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22885572139303484,\n\
\ \"acc_stderr\": 0.029705284056772422,\n \"acc_norm\": 0.22885572139303484,\n\
\ \"acc_norm_stderr\": 0.029705284056772422\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3313253012048193,\n\
\ \"acc_stderr\": 0.036643147772880864,\n \"acc_norm\": 0.3313253012048193,\n\
\ \"acc_norm_stderr\": 0.036643147772880864\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.30409356725146197,\n \"acc_stderr\": 0.03528211258245232,\n\
\ \"acc_norm\": 0.30409356725146197,\n \"acc_norm_stderr\": 0.03528211258245232\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24479804161566707,\n\
\ \"mc1_stderr\": 0.01505186948671501,\n \"mc2\": 0.4269871364718513,\n\
\ \"mc2_stderr\": 0.014897248723095273\n }\n}\n```"
repo_url: https://huggingface.co/cerebras/Cerebras-GPT-1.3B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|arc:challenge|25_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hellaswag|10_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T11:08:05.365000.parquet'
- config_name: results
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- results_2023-07-18T11:08:05.365000.parquet
- split: latest
path:
- results_2023-07-18T11:08:05.365000.parquet
---
# Dataset Card for Evaluation run of cerebras/Cerebras-GPT-1.3B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/cerebras/Cerebras-GPT-1.3B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [cerebras/Cerebras-GPT-1.3B](https://huggingface.co/cerebras/Cerebras-GPT-1.3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run appears as a specific split in each configuration, named after the timestamp of the run; the "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
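The "all" block in the results combines the per-task metric blocks; a minimal sketch of that aggregation, assuming a simple macro-average over tasks (the input values below are hypothetical):

```python
def aggregate(task_metrics):
    """Macro-average each metric across tasks.

    Assumes the "all" block is a plain mean over the per-task blocks;
    the values passed in below are hypothetical.
    """
    keys = task_metrics[0].keys()
    return {k: sum(t[k] for t in task_metrics) / len(task_metrics) for k in keys}

print(aggregate([{"acc": 0.25}, {"acc": 0.75}]))  # → {'acc': 0.5}
```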
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cerebras__Cerebras-GPT-1.3B",
"harness_truthfulqa_mc_0",
	split="latest")
```
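Timestamped split names encode the run time with `_` in place of `-` and `:` (e.g. `2023_07_18T11_08_05.365000`). A small helper, assuming that naming convention, to recover a `datetime` from a split name:

```python
from datetime import datetime

def split_to_timestamp(split_name: str) -> datetime:
    # Split names swap '-' and ':' for '_' on either side of the 'T';
    # this reverses that mapping (an assumption based on the names above).
    date_part, time_part = split_name.split("T")
    return datetime.fromisoformat(
        date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    )

print(split_to_timestamp("2023_07_18T11_08_05.365000"))  # → 2023-07-18 11:08:05.365000
```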
## Latest results
These are the [latest results from run 2023-07-18T11:08:05.365000](https://huggingface.co/datasets/open-llm-leaderboard/details_cerebras__Cerebras-GPT-1.3B/blob/main/results_2023-07-18T11%3A08%3A05.365000.json) (note that there might be results for other tasks in the repo if successive evaluations did not cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each evaluation):
```python
{
"all": {
"acc": 0.26651172140766793,
"acc_stderr": 0.03193977477898276,
"acc_norm": 0.26790088316822264,
"acc_norm_stderr": 0.03194994540006838,
"mc1": 0.24479804161566707,
"mc1_stderr": 0.01505186948671501,
"mc2": 0.4269871364718513,
"mc2_stderr": 0.014897248723095273
},
"harness|arc:challenge|25": {
"acc": 0.23720136518771331,
"acc_stderr": 0.01243039982926084,
"acc_norm": 0.2627986348122867,
"acc_norm_stderr": 0.012862523175351333
},
"harness|hellaswag|10": {
"acc": 0.32901812387970525,
"acc_stderr": 0.004688963175758139,
"acc_norm": 0.385381398127863,
"acc_norm_stderr": 0.004856906473719379
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2074074074074074,
"acc_stderr": 0.03502553170678316,
"acc_norm": 0.2074074074074074,
"acc_norm_stderr": 0.03502553170678316
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.033176727875331574,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.033176727875331574
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2679245283018868,
"acc_stderr": 0.02725726032249485,
"acc_norm": 0.2679245283018868,
"acc_norm_stderr": 0.02725726032249485
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080343,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080343
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.03368762932259431,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.03368762932259431
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.029241883869628806,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.029241883869628806
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481404,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481404
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.022019080012217893,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.022019080012217893
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.039325376803928704,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.039325376803928704
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2032258064516129,
"acc_stderr": 0.022891687984554963,
"acc_norm": 0.2032258064516129,
"acc_norm_stderr": 0.022891687984554963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.22660098522167488,
"acc_stderr": 0.029454863835292965,
"acc_norm": 0.22660098522167488,
"acc_norm_stderr": 0.029454863835292965
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.03427743175816524,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.03427743175816524
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.31313131313131315,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.31313131313131315,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3471502590673575,
"acc_stderr": 0.034356961683613546,
"acc_norm": 0.3471502590673575,
"acc_norm_stderr": 0.034356961683613546
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3230769230769231,
"acc_stderr": 0.023710888501970562,
"acc_norm": 0.3230769230769231,
"acc_norm_stderr": 0.023710888501970562
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712166,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712166
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.027025433498882364,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.027025433498882364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.03479185572599659,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.03479185572599659
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3431192660550459,
"acc_stderr": 0.02035477773608604,
"acc_norm": 0.3431192660550459,
"acc_norm_stderr": 0.02035477773608604
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22058823529411764,
"acc_stderr": 0.02910225438967409,
"acc_norm": 0.22058823529411764,
"acc_norm_stderr": 0.02910225438967409
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598014,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598014
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.23318385650224216,
"acc_stderr": 0.028380391147094716,
"acc_norm": 0.23318385650224216,
"acc_norm_stderr": 0.028380391147094716
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.037276735755969174,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.037276735755969174
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2809917355371901,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.2809917355371901,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.029872577708891155,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.029872577708891155
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24648786717752236,
"acc_stderr": 0.015411308769686929,
"acc_norm": 0.24648786717752236,
"acc_norm_stderr": 0.015411308769686929
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2630057803468208,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.2630057803468208,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23687150837988827,
"acc_stderr": 0.014219570788103982,
"acc_norm": 0.23687150837988827,
"acc_norm_stderr": 0.014219570788103982
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.026090162504279042,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.026090162504279042
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26366559485530544,
"acc_stderr": 0.02502553850053234,
"acc_norm": 0.26366559485530544,
"acc_norm_stderr": 0.02502553850053234
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.02456922360046085,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.02456922360046085
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2801418439716312,
"acc_stderr": 0.02678917235114023,
"acc_norm": 0.2801418439716312,
"acc_norm_stderr": 0.02678917235114023
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2379400260756193,
"acc_stderr": 0.010875700787694242,
"acc_norm": 0.2379400260756193,
"acc_norm_stderr": 0.010875700787694242
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41911764705882354,
"acc_stderr": 0.02997280717046462,
"acc_norm": 0.41911764705882354,
"acc_norm_stderr": 0.02997280717046462
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.018120224251484587,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.018120224251484587
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.16363636363636364,
"acc_stderr": 0.03543433054298678,
"acc_norm": 0.16363636363636364,
"acc_norm_stderr": 0.03543433054298678
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.23673469387755103,
"acc_stderr": 0.027212835884073142,
"acc_norm": 0.23673469387755103,
"acc_norm_stderr": 0.027212835884073142
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22885572139303484,
"acc_stderr": 0.029705284056772422,
"acc_norm": 0.22885572139303484,
"acc_norm_stderr": 0.029705284056772422
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3313253012048193,
"acc_stderr": 0.036643147772880864,
"acc_norm": 0.3313253012048193,
"acc_norm_stderr": 0.036643147772880864
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30409356725146197,
"acc_stderr": 0.03528211258245232,
"acc_norm": 0.30409356725146197,
"acc_norm_stderr": 0.03528211258245232
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24479804161566707,
"mc1_stderr": 0.01505186948671501,
"mc2": 0.4269871364718513,
"mc2_stderr": 0.014897248723095273
}
}
```
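The per-task metrics above are plain JSON, so deriving aggregates is straightforward. As a minimal sketch (using two of the hendrycksTest entries above as stand-in data), an MMLU-style average accuracy over the `harness|hendrycksTest-*` subtasks can be computed like this:

```python
# Minimal sketch: average the "acc" values of the MMLU (hendrycksTest)
# subtasks from a results dict shaped like the JSON block above.
# Only two subtasks are copied here as stand-in data.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.25},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.2074074074074074},
}

mmlu_accs = [
    scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(round(sum(mmlu_accs) / len(mmlu_accs), 4))  # -> 0.2287
```

The same filter works on the full dict loaded from the results JSON file linked above.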
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_cerebras__Cerebras-GPT-13B | 2023-08-27T12:38:41.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of cerebras/Cerebras-GPT-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cerebras/Cerebras-GPT-13B](https://huggingface.co/cerebras/Cerebras-GPT-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cerebras__Cerebras-GPT-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-19T19:05:05.976819](https://huggingface.co/datasets/open-llm-leaderboard/details_cerebras__Cerebras-GPT-13B/blob/main/results_2023-07-19T19%3A05%3A05.976819.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26379746204985755,\n\
\ \"acc_stderr\": 0.031825642312569465,\n \"acc_norm\": 0.2670819405092585,\n\
\ \"acc_norm_stderr\": 0.03183066886310831,\n \"mc1\": 0.22766217870257038,\n\
\ \"mc1_stderr\": 0.01467925503211107,\n \"mc2\": 0.39185464744654125,\n\
\ \"mc2_stderr\": 0.013884078720404066\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3378839590443686,\n \"acc_stderr\": 0.013822047922283505,\n\
\ \"acc_norm\": 0.38139931740614336,\n \"acc_norm_stderr\": 0.014194389086685256\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.44981079466241786,\n\
\ \"acc_stderr\": 0.00496457968571244,\n \"acc_norm\": 0.6000796654052978,\n\
\ \"acc_norm_stderr\": 0.004888805003103053\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.03785714465066656,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.03785714465066656\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21710526315789475,\n \"acc_stderr\": 0.03355045304882923,\n\
\ \"acc_norm\": 0.21710526315789475,\n \"acc_norm_stderr\": 0.03355045304882923\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.19,\n\
\ \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.19,\n \
\ \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2528301886792453,\n \"acc_stderr\": 0.026749899771241235,\n\
\ \"acc_norm\": 0.2528301886792453,\n \"acc_norm_stderr\": 0.026749899771241235\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2916666666666667,\n\
\ \"acc_stderr\": 0.03800968060554859,\n \"acc_norm\": 0.2916666666666667,\n\
\ \"acc_norm_stderr\": 0.03800968060554859\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.28,\n\
\ \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23121387283236994,\n\
\ \"acc_stderr\": 0.0321473730202947,\n \"acc_norm\": 0.23121387283236994,\n\
\ \"acc_norm_stderr\": 0.0321473730202947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179963,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179963\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.028504856470514175,\n\
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.028504856470514175\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281336,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281336\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.03600105692727771,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.03600105692727771\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24338624338624337,\n \"acc_stderr\": 0.022101128787415422,\n \"\
acc_norm\": 0.24338624338624337,\n \"acc_norm_stderr\": 0.022101128787415422\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\
\ \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n\
\ \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24838709677419354,\n\
\ \"acc_stderr\": 0.02458002892148101,\n \"acc_norm\": 0.24838709677419354,\n\
\ \"acc_norm_stderr\": 0.02458002892148101\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.22660098522167488,\n \"acc_stderr\": 0.029454863835292982,\n\
\ \"acc_norm\": 0.22660098522167488,\n \"acc_norm_stderr\": 0.029454863835292982\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.23737373737373738,\n \"acc_stderr\": 0.030313710538198896,\n \"\
acc_norm\": 0.23737373737373738,\n \"acc_norm_stderr\": 0.030313710538198896\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860674,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.31794871794871793,\n \"acc_stderr\": 0.02361088430892786,\n\
\ \"acc_norm\": 0.31794871794871793,\n \"acc_norm_stderr\": 0.02361088430892786\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.026202766534652148,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.026202766534652148\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2773109243697479,\n \"acc_stderr\": 0.029079374539480007,\n\
\ \"acc_norm\": 0.2773109243697479,\n \"acc_norm_stderr\": 0.029079374539480007\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23178807947019867,\n \"acc_stderr\": 0.034454062719870546,\n \"\
acc_norm\": 0.23178807947019867,\n \"acc_norm_stderr\": 0.034454062719870546\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.28256880733944956,\n \"acc_stderr\": 0.019304243497707152,\n \"\
acc_norm\": 0.28256880733944956,\n \"acc_norm_stderr\": 0.019304243497707152\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.03395322726375798,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.03395322726375798\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604243,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604243\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.27848101265822783,\n \"acc_stderr\": 0.029178682304842548,\n \
\ \"acc_norm\": 0.27848101265822783,\n \"acc_norm_stderr\": 0.029178682304842548\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.25112107623318386,\n\
\ \"acc_stderr\": 0.029105220833224622,\n \"acc_norm\": 0.25112107623318386,\n\
\ \"acc_norm_stderr\": 0.029105220833224622\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2644628099173554,\n \"acc_stderr\": 0.04026187527591205,\n \"\
acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.04026187527591205\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.04246624336697625,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.04246624336697625\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.04245022486384493,\n\
\ \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.04245022486384493\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.25213675213675213,\n\
\ \"acc_stderr\": 0.02844796547623102,\n \"acc_norm\": 0.25213675213675213,\n\
\ \"acc_norm_stderr\": 0.02844796547623102\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2835249042145594,\n\
\ \"acc_stderr\": 0.016117318166832272,\n \"acc_norm\": 0.2835249042145594,\n\
\ \"acc_norm_stderr\": 0.016117318166832272\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2630057803468208,\n \"acc_stderr\": 0.023703099525258172,\n\
\ \"acc_norm\": 0.2630057803468208,\n \"acc_norm_stderr\": 0.023703099525258172\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.01489339173524959,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.01489339173524959\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.025058503316958157,\n\
\ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.025058503316958157\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.27009646302250806,\n\
\ \"acc_stderr\": 0.025218040373410622,\n \"acc_norm\": 0.27009646302250806,\n\
\ \"acc_norm_stderr\": 0.025218040373410622\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24382716049382716,\n \"acc_stderr\": 0.02389187954195961,\n\
\ \"acc_norm\": 0.24382716049382716,\n \"acc_norm_stderr\": 0.02389187954195961\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24822695035460993,\n \"acc_stderr\": 0.025770015644290385,\n \
\ \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.025770015644290385\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2516297262059974,\n\
\ \"acc_stderr\": 0.011083276280441898,\n \"acc_norm\": 0.2516297262059974,\n\
\ \"acc_norm_stderr\": 0.011083276280441898\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.19117647058823528,\n \"acc_stderr\": 0.023886881922440362,\n\
\ \"acc_norm\": 0.19117647058823528,\n \"acc_norm_stderr\": 0.023886881922440362\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24183006535947713,\n \"acc_stderr\": 0.017322789207784326,\n \
\ \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.017322789207784326\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3551020408163265,\n \"acc_stderr\": 0.030635655150387638,\n\
\ \"acc_norm\": 0.3551020408163265,\n \"acc_norm_stderr\": 0.030635655150387638\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.25870646766169153,\n\
\ \"acc_stderr\": 0.030965903123573005,\n \"acc_norm\": 0.25870646766169153,\n\
\ \"acc_norm_stderr\": 0.030965903123573005\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3072289156626506,\n\
\ \"acc_stderr\": 0.03591566797824662,\n \"acc_norm\": 0.3072289156626506,\n\
\ \"acc_norm_stderr\": 0.03591566797824662\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.30409356725146197,\n \"acc_stderr\": 0.03528211258245232,\n\
\ \"acc_norm\": 0.30409356725146197,\n \"acc_norm_stderr\": 0.03528211258245232\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22766217870257038,\n\
\ \"mc1_stderr\": 0.01467925503211107,\n \"mc2\": 0.39185464744654125,\n\
\ \"mc2_stderr\": 0.013884078720404066\n }\n}\n```"
repo_url: https://huggingface.co/cerebras/Cerebras-GPT-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|arc:challenge|25_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hellaswag|10_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:05:05.976819.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:05:05.976819.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T19:05:05.976819.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T19:05:05.976819.parquet'
- config_name: results
data_files:
- split: 2023_07_19T19_05_05.976819
path:
- results_2023-07-19T19:05:05.976819.parquet
- split: latest
path:
- results_2023-07-19T19:05:05.976819.parquet
---
# Dataset Card for Evaluation run of cerebras/Cerebras-GPT-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/cerebras/Cerebras-GPT-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [cerebras/Cerebras-GPT-13B](https://huggingface.co/cerebras/Cerebras-GPT-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cerebras__Cerebras-GPT-13B",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-07-19T19:05:05.976819](https://huggingface.co/datasets/open-llm-leaderboard/details_cerebras__Cerebras-GPT-13B/blob/main/results_2023-07-19T19%3A05%3A05.976819.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26379746204985755,
"acc_stderr": 0.031825642312569465,
"acc_norm": 0.2670819405092585,
"acc_norm_stderr": 0.03183066886310831,
"mc1": 0.22766217870257038,
"mc1_stderr": 0.01467925503211107,
"mc2": 0.39185464744654125,
"mc2_stderr": 0.013884078720404066
},
"harness|arc:challenge|25": {
"acc": 0.3378839590443686,
"acc_stderr": 0.013822047922283505,
"acc_norm": 0.38139931740614336,
"acc_norm_stderr": 0.014194389086685256
},
"harness|hellaswag|10": {
"acc": 0.44981079466241786,
"acc_stderr": 0.00496457968571244,
"acc_norm": 0.6000796654052978,
"acc_norm_stderr": 0.004888805003103053
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.03785714465066656,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.03785714465066656
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21710526315789475,
"acc_stderr": 0.03355045304882923,
"acc_norm": 0.21710526315789475,
"acc_norm_stderr": 0.03355045304882923
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2528301886792453,
"acc_stderr": 0.026749899771241235,
"acc_norm": 0.2528301886792453,
"acc_norm_stderr": 0.026749899771241235
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.03800968060554859,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.03800968060554859
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.0321473730202947,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.0321473730202947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179963,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179963
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.028504856470514175,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.028504856470514175
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281336,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281336
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.03600105692727771,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.03600105692727771
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24338624338624337,
"acc_stderr": 0.022101128787415422,
"acc_norm": 0.24338624338624337,
"acc_norm_stderr": 0.022101128787415422
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.04104947269903394,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.04104947269903394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24838709677419354,
"acc_stderr": 0.02458002892148101,
"acc_norm": 0.24838709677419354,
"acc_norm_stderr": 0.02458002892148101
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.22660098522167488,
"acc_stderr": 0.029454863835292982,
"acc_norm": 0.22660098522167488,
"acc_norm_stderr": 0.029454863835292982
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23737373737373738,
"acc_stderr": 0.030313710538198896,
"acc_norm": 0.23737373737373738,
"acc_norm_stderr": 0.030313710538198896
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860674,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.31794871794871793,
"acc_stderr": 0.02361088430892786,
"acc_norm": 0.31794871794871793,
"acc_norm_stderr": 0.02361088430892786
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.026202766534652148,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.026202766534652148
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2773109243697479,
"acc_stderr": 0.029079374539480007,
"acc_norm": 0.2773109243697479,
"acc_norm_stderr": 0.029079374539480007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23178807947019867,
"acc_stderr": 0.034454062719870546,
"acc_norm": 0.23178807947019867,
"acc_norm_stderr": 0.034454062719870546
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.28256880733944956,
"acc_stderr": 0.019304243497707152,
"acc_norm": 0.28256880733944956,
"acc_norm_stderr": 0.019304243497707152
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.03395322726375798,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.03395322726375798
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604243,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604243
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.27848101265822783,
"acc_stderr": 0.029178682304842548,
"acc_norm": 0.27848101265822783,
"acc_norm_stderr": 0.029178682304842548
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.25112107623318386,
"acc_stderr": 0.029105220833224622,
"acc_norm": 0.25112107623318386,
"acc_norm_stderr": 0.029105220833224622
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2644628099173554,
"acc_stderr": 0.04026187527591205,
"acc_norm": 0.2644628099173554,
"acc_norm_stderr": 0.04026187527591205
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.04246624336697625,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.04246624336697625
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.04245022486384493,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.04245022486384493
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.25213675213675213,
"acc_stderr": 0.02844796547623102,
"acc_norm": 0.25213675213675213,
"acc_norm_stderr": 0.02844796547623102
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2835249042145594,
"acc_stderr": 0.016117318166832272,
"acc_norm": 0.2835249042145594,
"acc_norm_stderr": 0.016117318166832272
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2630057803468208,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.2630057803468208,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.01489339173524959,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.01489339173524959
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.025058503316958157,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.025058503316958157
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.27009646302250806,
"acc_stderr": 0.025218040373410622,
"acc_norm": 0.27009646302250806,
"acc_norm_stderr": 0.025218040373410622
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24382716049382716,
"acc_stderr": 0.02389187954195961,
"acc_norm": 0.24382716049382716,
"acc_norm_stderr": 0.02389187954195961
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.025770015644290385,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.025770015644290385
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2516297262059974,
"acc_stderr": 0.011083276280441898,
"acc_norm": 0.2516297262059974,
"acc_norm_stderr": 0.011083276280441898
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.19117647058823528,
"acc_stderr": 0.023886881922440362,
"acc_norm": 0.19117647058823528,
"acc_norm_stderr": 0.023886881922440362
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.017322789207784326,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.017322789207784326
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3551020408163265,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.3551020408163265,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.25870646766169153,
"acc_stderr": 0.030965903123573005,
"acc_norm": 0.25870646766169153,
"acc_norm_stderr": 0.030965903123573005
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3072289156626506,
"acc_stderr": 0.03591566797824662,
"acc_norm": 0.3072289156626506,
"acc_norm_stderr": 0.03591566797824662
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30409356725146197,
"acc_stderr": 0.03528211258245232,
"acc_norm": 0.30409356725146197,
"acc_norm_stderr": 0.03528211258245232
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22766217870257038,
"mc1_stderr": 0.01467925503211107,
"mc2": 0.39185464744654125,
"mc2_stderr": 0.013884078720404066
}
}
```
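The per-task scores above are plain JSON, so once downloaded they can be read out with the standard library alone. A minimal sketch, using only the TruthfulQA excerpt shown above (the full results file contains one such entry per task):

```python
import json

# Excerpt of the results JSON shown above (TruthfulQA entry only).
excerpt = """
{
  "harness|truthfulqa:mc|0": {
    "mc1": 0.22766217870257038,
    "mc1_stderr": 0.01467925503211107,
    "mc2": 0.39185464744654125,
    "mc2_stderr": 0.013884078720404066
  }
}
"""

results = json.loads(excerpt)
mc2 = results["harness|truthfulqa:mc|0"]["mc2"]
print(round(mc2, 4))  # 0.3919
```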
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-astronomy-neg-prepend-fix | 2023-08-21T07:31:36.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 7144
num_examples: 5
- name: test
num_bytes: 498344
num_examples: 152
download_size: 15249
dataset_size: 505488
---
# Dataset Card for "mmlu-astronomy-neg-prepend-fix"
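The `answer` field in this dataset is stored as a class-label index rather than a letter. Decoding it back follows the `'0': A` … `'3': D` names declared in the `dataset_info` block above; a minimal stdlib sketch (the helper name is illustrative):

```python
# Class-label names declared in the dataset_info above ('0': A ... '3': D).
LABEL_NAMES = ["A", "B", "C", "D"]

def decode_answer(idx: int) -> str:
    """Map a stored class-label index back to its answer letter."""
    return LABEL_NAMES[idx]

print(decode_answer(2))  # C
```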
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_cerebras__Cerebras-GPT-111M | 2023-09-22T19:15:57.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of cerebras/Cerebras-GPT-111M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cerebras/Cerebras-GPT-111M](https://huggingface.co/cerebras/Cerebras-GPT-111M)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cerebras__Cerebras-GPT-111M\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T19:15:45.776483](https://huggingface.co/datasets/open-llm-leaderboard/details_cerebras__Cerebras-GPT-111M/blob/main/results_2023-09-22T19-15-45.776483.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n\
\ \"em_stderr\": 0.00033145814652193176,\n \"f1\": 0.021427223154362497,\n\
\ \"f1_stderr\": 0.0008720566428263053,\n \"acc\": 0.23875295974743488,\n\
\ \"acc_stderr\": 0.00701912891202994\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.00033145814652193176,\n\
\ \"f1\": 0.021427223154362497,\n \"f1_stderr\": 0.0008720566428263053\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.47750591949486976,\n\
\ \"acc_stderr\": 0.01403825782405988\n }\n}\n```"
repo_url: https://huggingface.co/cerebras/Cerebras-GPT-111M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T19_15_45.776483
path:
- '**/details_harness|drop|3_2023-09-22T19-15-45.776483.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T19-15-45.776483.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T19_15_45.776483
path:
- '**/details_harness|gsm8k|5_2023-09-22T19-15-45.776483.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T19-15-45.776483.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:47:12.878137.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:47:12.878137.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:47:12.878137.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T19_15_45.776483
path:
- '**/details_harness|winogrande|5_2023-09-22T19-15-45.776483.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T19-15-45.776483.parquet'
- config_name: results
data_files:
- split: 2023_07_19T13_47_12.878137
path:
- results_2023-07-19T13:47:12.878137.parquet
- split: 2023_09_22T19_15_45.776483
path:
- results_2023-09-22T19-15-45.776483.parquet
- split: latest
path:
- results_2023-09-22T19-15-45.776483.parquet
---
# Dataset Card for Evaluation run of cerebras/Cerebras-GPT-111M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/cerebras/Cerebras-GPT-111M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [cerebras/Cerebras-GPT-111M](https://huggingface.co/cerebras/Cerebras-GPT-111M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cerebras__Cerebras-GPT-111M",
"harness_winogrande_5",
split="train")
```
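Because the timestamped split names use a zero-padded, year-first format, lexicographic order matches chronological order. A minimal sketch (hard-coding the two run timestamps from this repo for illustration) of picking the newest run without relying on the `latest` alias:

```python
# Timestamped split names as they appear in this dataset card.
# Zero-padded, year-first formatting means lexicographic order
# equals chronological order, so max() returns the most recent run,
# i.e. the run that the "latest" split points to.
splits = ["2023_07_19T13_47_12.878137", "2023_09_22T19_15_45.776483"]

newest = max(splits)
print(newest)  # 2023_09_22T19_15_45.776483
```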
## Latest results
These are the [latest results from run 2023-09-22T19:15:45.776483](https://huggingface.co/datasets/open-llm-leaderboard/details_cerebras__Cerebras-GPT-111M/blob/main/results_2023-09-22T19-15-45.776483.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0010486577181208054,
"em_stderr": 0.00033145814652193176,
"f1": 0.021427223154362497,
"f1_stderr": 0.0008720566428263053,
"acc": 0.23875295974743488,
"acc_stderr": 0.00701912891202994
},
"harness|drop|3": {
"em": 0.0010486577181208054,
"em_stderr": 0.00033145814652193176,
"f1": 0.021427223154362497,
"f1_stderr": 0.0008720566428263053
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.47750591949486976,
"acc_stderr": 0.01403825782405988
}
}
```
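The `"all"` block above is consistent with a simple unweighted mean of the per-task metrics: averaging the `acc` of `harness|gsm8k|5` and `harness|winogrande|5` reproduces the aggregated `acc`. A quick sanity check (not the leaderboard's official aggregation code):

```python
# Per-task accuracies copied from the results JSON above.
gsm8k_acc = 0.0
winogrande_acc = 0.47750591949486976

# Unweighted mean over the two acc-reporting tasks matches "all".acc.
all_acc = (gsm8k_acc + winogrande_acc) / 2
print(all_acc)  # 0.23875295974743488
```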
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-business_ethics-neg-prepend-fix | 2023-08-21T07:31:48.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 8912
num_examples: 5
- name: test
num_bytes: 380226
num_examples: 100
download_size: 17137
dataset_size: 389138
---
# Dataset Card for "mmlu-business_ethics-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_cerebras__Cerebras-GPT-6.7B | 2023-08-27T12:38:44.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of cerebras/Cerebras-GPT-6.7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cerebras/Cerebras-GPT-6.7B](https://huggingface.co/cerebras/Cerebras-GPT-6.7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cerebras__Cerebras-GPT-6.7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-19T16:33:57.181673](https://huggingface.co/datasets/open-llm-leaderboard/details_cerebras__Cerebras-GPT-6.7B/blob/main/results_2023-07-19T16%3A33%3A57.181673.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2632743916863238,\n\
\ \"acc_stderr\": 0.0318069720420694,\n \"acc_norm\": 0.2664995658023207,\n\
\ \"acc_norm_stderr\": 0.031813493700888505,\n \"mc1\": 0.24357405140758873,\n\
\ \"mc1_stderr\": 0.015026354824910782,\n \"mc2\": 0.3802394598585255,\n\
\ \"mc2_stderr\": 0.013925842027078916\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.30887372013651876,\n \"acc_stderr\": 0.013501770929344003,\n\
\ \"acc_norm\": 0.3506825938566553,\n \"acc_norm_stderr\": 0.013944635930726087\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4451304521011751,\n\
\ \"acc_stderr\": 0.00495964526339023,\n \"acc_norm\": 0.5936068512248556,\n\
\ \"acc_norm_stderr\": 0.00490155813233552\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n\
\ \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n\
\ \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.24342105263157895,\n \"acc_stderr\": 0.034923496688842384,\n\
\ \"acc_norm\": 0.24342105263157895,\n \"acc_norm_stderr\": 0.034923496688842384\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.13,\n\
\ \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.13,\n \
\ \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.22264150943396227,\n \"acc_stderr\": 0.025604233470899098,\n\
\ \"acc_norm\": 0.22264150943396227,\n \"acc_norm_stderr\": 0.025604233470899098\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2152777777777778,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.2152777777777778,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364396,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364396\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3021276595744681,\n \"acc_stderr\": 0.030017554471880557,\n\
\ \"acc_norm\": 0.3021276595744681,\n \"acc_norm_stderr\": 0.030017554471880557\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.040493392977481404,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.040493392977481404\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n\
\ \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24338624338624337,\n \"acc_stderr\": 0.022101128787415415,\n \"\
acc_norm\": 0.24338624338624337,\n \"acc_norm_stderr\": 0.022101128787415415\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.038095238095238106,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.038095238095238106\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.21935483870967742,\n\
\ \"acc_stderr\": 0.023540799358723292,\n \"acc_norm\": 0.21935483870967742,\n\
\ \"acc_norm_stderr\": 0.023540799358723292\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.20689655172413793,\n \"acc_stderr\": 0.028501378167893946,\n\
\ \"acc_norm\": 0.20689655172413793,\n \"acc_norm_stderr\": 0.028501378167893946\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.03524390844511784,\n\
\ \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.03524390844511784\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.18686868686868688,\n \"acc_stderr\": 0.027772533334218974,\n \"\
acc_norm\": 0.18686868686868688,\n \"acc_norm_stderr\": 0.027772533334218974\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.029778663037752943,\n\
\ \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.029778663037752943\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.24871794871794872,\n \"acc_stderr\": 0.021916957709213793,\n\
\ \"acc_norm\": 0.24871794871794872,\n \"acc_norm_stderr\": 0.021916957709213793\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176896,\n\
\ \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176896\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.30458715596330277,\n\
\ \"acc_stderr\": 0.01973229942035404,\n \"acc_norm\": 0.30458715596330277,\n\
\ \"acc_norm_stderr\": 0.01973229942035404\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.3425925925925926,\n \"acc_stderr\": 0.03236585252602156,\n\
\ \"acc_norm\": 0.3425925925925926,\n \"acc_norm_stderr\": 0.03236585252602156\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.22362869198312235,\n \"acc_stderr\": 0.02712329820522997,\n \
\ \"acc_norm\": 0.22362869198312235,\n \"acc_norm_stderr\": 0.02712329820522997\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.29596412556053814,\n\
\ \"acc_stderr\": 0.030636591348699796,\n \"acc_norm\": 0.29596412556053814,\n\
\ \"acc_norm_stderr\": 0.030636591348699796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2644628099173554,\n \"acc_stderr\": 0.04026187527591203,\n \"\
acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.04026187527591203\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.035590395316173425,\n\
\ \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.035590395316173425\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n\
\ \"acc_stderr\": 0.04007341809755806,\n \"acc_norm\": 0.23214285714285715,\n\
\ \"acc_norm_stderr\": 0.04007341809755806\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.27184466019417475,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.27184466019417475,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23931623931623933,\n\
\ \"acc_stderr\": 0.027951826808924333,\n \"acc_norm\": 0.23931623931623933,\n\
\ \"acc_norm_stderr\": 0.027951826808924333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26053639846743293,\n\
\ \"acc_stderr\": 0.015696008563807082,\n \"acc_norm\": 0.26053639846743293,\n\
\ \"acc_norm_stderr\": 0.015696008563807082\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2832369942196532,\n \"acc_stderr\": 0.02425790170532337,\n\
\ \"acc_norm\": 0.2832369942196532,\n \"acc_norm_stderr\": 0.02425790170532337\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.02428861946604611,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.02428861946604611\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2604501607717042,\n\
\ \"acc_stderr\": 0.024926723224845543,\n \"acc_norm\": 0.2604501607717042,\n\
\ \"acc_norm_stderr\": 0.024926723224845543\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.023788583551658523,\n\
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.023788583551658523\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24822695035460993,\n \"acc_stderr\": 0.025770015644290385,\n \
\ \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.025770015644290385\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26988265971316816,\n\
\ \"acc_stderr\": 0.011337381084250402,\n \"acc_norm\": 0.26988265971316816,\n\
\ \"acc_norm_stderr\": 0.011337381084250402\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.029896163033125474,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.029896163033125474\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24836601307189543,\n \"acc_stderr\": 0.017479487001364764,\n \
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.017479487001364764\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n\
\ \"acc_stderr\": 0.041220665028782834,\n \"acc_norm\": 0.24545454545454545,\n\
\ \"acc_norm_stderr\": 0.041220665028782834\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.025607375986579153,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.025607375986579153\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n\
\ \"acc_stderr\": 0.030769444967296014,\n \"acc_norm\": 0.2537313432835821,\n\
\ \"acc_norm_stderr\": 0.030769444967296014\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.26506024096385544,\n\
\ \"acc_stderr\": 0.03436024037944967,\n \"acc_norm\": 0.26506024096385544,\n\
\ \"acc_norm_stderr\": 0.03436024037944967\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24357405140758873,\n\
\ \"mc1_stderr\": 0.015026354824910782,\n \"mc2\": 0.3802394598585255,\n\
\ \"mc2_stderr\": 0.013925842027078916\n }\n}\n```"
repo_url: https://huggingface.co/cerebras/Cerebras-GPT-6.7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:33:57.181673.parquet'
- config_name: results
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- results_2023-07-19T16:33:57.181673.parquet
- split: latest
path:
- results_2023-07-19T16:33:57.181673.parquet
---
# Dataset Card for Evaluation run of cerebras/Cerebras-GPT-6.7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/cerebras/Cerebras-GPT-6.7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [cerebras/Cerebras-GPT-6.7B](https://huggingface.co/cerebras/Cerebras-GPT-6.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cerebras__Cerebras-GPT-6.7B",
"harness_truthfulqa_mc_0",
split="train")
```
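Since each configuration exposes both timestamped splits and a `latest` alias, a small helper (hypothetical, not part of any official API; the config name in the example is one of the configurations listed above) can make pinning a run explicit. This is a sketch assuming the `datasets` package is installed; the actual call needs network access to the Hugging Face Hub, so it is shown commented out:

```python
def load_task_details(config_name: str, split: str = "latest"):
    """Load per-example details for one evaluated task.

    split="latest" always points at the most recent run; passing a
    timestamped split name instead (e.g. "2023_07_19T16_33_57.181673")
    pins a specific evaluation run.
    """
    # Imported lazily so the helper can be defined without `datasets` installed.
    from datasets import load_dataset

    repo = "open-llm-leaderboard/details_cerebras__Cerebras-GPT-6.7B"
    return load_dataset(repo, config_name, split=split)

# Example (requires network access to the Hugging Face Hub):
# astronomy = load_task_details("harness_hendrycksTest_astronomy_5")
```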
## Latest results
These are the [latest results from run 2023-07-19T16:33:57.181673](https://huggingface.co/datasets/open-llm-leaderboard/details_cerebras__Cerebras-GPT-6.7B/blob/main/results_2023-07-19T16%3A33%3A57.181673.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2632743916863238,
"acc_stderr": 0.0318069720420694,
"acc_norm": 0.2664995658023207,
"acc_norm_stderr": 0.031813493700888505,
"mc1": 0.24357405140758873,
"mc1_stderr": 0.015026354824910782,
"mc2": 0.3802394598585255,
"mc2_stderr": 0.013925842027078916
},
"harness|arc:challenge|25": {
"acc": 0.30887372013651876,
"acc_stderr": 0.013501770929344003,
"acc_norm": 0.3506825938566553,
"acc_norm_stderr": 0.013944635930726087
},
"harness|hellaswag|10": {
"acc": 0.4451304521011751,
"acc_stderr": 0.00495964526339023,
"acc_norm": 0.5936068512248556,
"acc_norm_stderr": 0.00490155813233552
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.24342105263157895,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.24342105263157895,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.13,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.13,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22264150943396227,
"acc_stderr": 0.025604233470899098,
"acc_norm": 0.22264150943396227,
"acc_norm_stderr": 0.025604233470899098
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2152777777777778,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.2152777777777778,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364396,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364396
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3021276595744681,
"acc_stderr": 0.030017554471880557,
"acc_norm": 0.3021276595744681,
"acc_norm_stderr": 0.030017554471880557
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481404,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481404
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24338624338624337,
"acc_stderr": 0.022101128787415415,
"acc_norm": 0.24338624338624337,
"acc_norm_stderr": 0.022101128787415415
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.038095238095238106,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.038095238095238106
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.21935483870967742,
"acc_stderr": 0.023540799358723292,
"acc_norm": 0.21935483870967742,
"acc_norm_stderr": 0.023540799358723292
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.20689655172413793,
"acc_stderr": 0.028501378167893946,
"acc_norm": 0.20689655172413793,
"acc_norm_stderr": 0.028501378167893946
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.28484848484848485,
"acc_stderr": 0.03524390844511784,
"acc_norm": 0.28484848484848485,
"acc_norm_stderr": 0.03524390844511784
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.18686868686868688,
"acc_stderr": 0.027772533334218974,
"acc_norm": 0.18686868686868688,
"acc_norm_stderr": 0.027772533334218974
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.029778663037752943,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.029778663037752943
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24871794871794872,
"acc_stderr": 0.021916957709213793,
"acc_norm": 0.24871794871794872,
"acc_norm_stderr": 0.021916957709213793
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176896,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176896
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.30458715596330277,
"acc_stderr": 0.01973229942035404,
"acc_norm": 0.30458715596330277,
"acc_norm_stderr": 0.01973229942035404
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3425925925925926,
"acc_stderr": 0.03236585252602156,
"acc_norm": 0.3425925925925926,
"acc_norm_stderr": 0.03236585252602156
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.22362869198312235,
"acc_stderr": 0.02712329820522997,
"acc_norm": 0.22362869198312235,
"acc_norm_stderr": 0.02712329820522997
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.29596412556053814,
"acc_stderr": 0.030636591348699796,
"acc_norm": 0.29596412556053814,
"acc_norm_stderr": 0.030636591348699796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2644628099173554,
"acc_stderr": 0.04026187527591203,
"acc_norm": 0.2644628099173554,
"acc_norm_stderr": 0.04026187527591203
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2883435582822086,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.2883435582822086,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.04007341809755806,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.04007341809755806
},
"harness|hendrycksTest-management|5": {
"acc": 0.27184466019417475,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.27184466019417475,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23931623931623933,
"acc_stderr": 0.027951826808924333,
"acc_norm": 0.23931623931623933,
"acc_norm_stderr": 0.027951826808924333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26053639846743293,
"acc_stderr": 0.015696008563807082,
"acc_norm": 0.26053639846743293,
"acc_norm_stderr": 0.015696008563807082
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2832369942196532,
"acc_stderr": 0.02425790170532337,
"acc_norm": 0.2832369942196532,
"acc_norm_stderr": 0.02425790170532337
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.02428861946604611,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.02428861946604611
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2604501607717042,
"acc_stderr": 0.024926723224845543,
"acc_norm": 0.2604501607717042,
"acc_norm_stderr": 0.024926723224845543
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.023788583551658523,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.023788583551658523
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.025770015644290385,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.025770015644290385
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26988265971316816,
"acc_stderr": 0.011337381084250402,
"acc_norm": 0.26988265971316816,
"acc_norm_stderr": 0.011337381084250402
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.029896163033125474,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.029896163033125474
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.017479487001364764,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.017479487001364764
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.041220665028782834,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.041220665028782834
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2,
"acc_stderr": 0.025607375986579153,
"acc_norm": 0.2,
"acc_norm_stderr": 0.025607375986579153
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.030769444967296014,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.030769444967296014
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-virology|5": {
"acc": 0.26506024096385544,
"acc_stderr": 0.03436024037944967,
"acc_norm": 0.26506024096385544,
"acc_norm_stderr": 0.03436024037944967
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24357405140758873,
"mc1_stderr": 0.015026354824910782,
"mc2": 0.3802394598585255,
"mc2_stderr": 0.013925842027078916
}
}
```
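For reference, the per-run split names used in these dataset configs appear to be derived mechanically from the run timestamp. The sketch below is inferred from the split names and file paths in this card, not from an official `datasets` API; `run_timestamp_to_split` is a hypothetical helper name.

```python
# A minimal sketch (inferred from this card's split names, not an official
# API): a run timestamp such as "2023-07-19T16:27:41.831056" maps to the
# split name "2023_07_19T16_27_41.831056" -- dashes and colons become
# underscores, while the fractional-second dot is kept.
def run_timestamp_to_split(timestamp: str) -> str:
    return timestamp.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2023-07-19T16:27:41.831056"))
```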
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
---
pretty_name: Evaluation run of cerebras/Cerebras-GPT-2.7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cerebras/Cerebras-GPT-2.7B](https://huggingface.co/cerebras/Cerebras-GPT-2.7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cerebras__Cerebras-GPT-2.7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-19T16:27:41.831056](https://huggingface.co/datasets/open-llm-leaderboard/details_cerebras__Cerebras-GPT-2.7B/blob/main/results_2023-07-19T16%3A27%3A41.831056.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2542414842389138,\n\
\ \"acc_stderr\": 0.03148980376550113,\n \"acc_norm\": 0.2564241922629107,\n\
\ \"acc_norm_stderr\": 0.031497213583879385,\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.015077219200662592,\n \"mc2\": 0.4136763359861922,\n\
\ \"mc2_stderr\": 0.014439422755488887\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2696245733788396,\n \"acc_stderr\": 0.012968040686869148,\n\
\ \"acc_norm\": 0.2909556313993174,\n \"acc_norm_stderr\": 0.013273077865907592\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.38548097988448515,\n\
\ \"acc_stderr\": 0.004857140410776741,\n \"acc_norm\": 0.4929296952798247,\n\
\ \"acc_norm_stderr\": 0.004989282516055396\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.26666666666666666,\n\
\ \"acc_stderr\": 0.03820169914517904,\n \"acc_norm\": 0.26666666666666666,\n\
\ \"acc_norm_stderr\": 0.03820169914517904\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19078947368421054,\n \"acc_stderr\": 0.03197565821032499,\n\
\ \"acc_norm\": 0.19078947368421054,\n \"acc_norm_stderr\": 0.03197565821032499\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n\
\ \"acc_stderr\": 0.037161774375660164,\n \"acc_norm\": 0.2708333333333333,\n\
\ \"acc_norm_stderr\": 0.037161774375660164\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23121387283236994,\n\
\ \"acc_stderr\": 0.032147373020294696,\n \"acc_norm\": 0.23121387283236994,\n\
\ \"acc_norm_stderr\": 0.032147373020294696\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.0379328118530781,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.0379328118530781\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.02850485647051419,\n\
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.02850485647051419\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20105820105820105,\n \"acc_stderr\": 0.020641810782370165,\n \"\
acc_norm\": 0.20105820105820105,\n \"acc_norm_stderr\": 0.020641810782370165\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.24516129032258063,\n \"acc_stderr\": 0.02447224384089553,\n \"\
acc_norm\": 0.24516129032258063,\n \"acc_norm_stderr\": 0.02447224384089553\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.270935960591133,\n \"acc_stderr\": 0.031270907132976984,\n \"\
acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.031270907132976984\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603488,\n\
\ \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603488\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.02962022787479048,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02962022787479048\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.029778663037752947,\n\
\ \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.029778663037752947\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.23846153846153847,\n \"acc_stderr\": 0.021606294494647727,\n\
\ \"acc_norm\": 0.23846153846153847,\n \"acc_norm_stderr\": 0.021606294494647727\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145668,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145668\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868952,\n\
\ \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868952\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23302752293577983,\n \"acc_stderr\": 0.0181256691808615,\n \"\
acc_norm\": 0.23302752293577983,\n \"acc_norm_stderr\": 0.0181256691808615\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.28703703703703703,\n \"acc_stderr\": 0.030851992993257013,\n \"\
acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.030851992993257013\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2696078431372549,\n \"acc_stderr\": 0.03114557065948678,\n \"\
acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.03114557065948678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.25738396624472576,\n \"acc_stderr\": 0.028458820991460302,\n \
\ \"acc_norm\": 0.25738396624472576,\n \"acc_norm_stderr\": 0.028458820991460302\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.242152466367713,\n\
\ \"acc_stderr\": 0.028751392398694755,\n \"acc_norm\": 0.242152466367713,\n\
\ \"acc_norm_stderr\": 0.028751392398694755\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.037276735755969174,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.037276735755969174\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03755265865037183,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03755265865037183\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.31901840490797545,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.31901840490797545,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.21359223300970873,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.21359223300970873,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n\
\ \"acc_stderr\": 0.02860595370200424,\n \"acc_norm\": 0.2564102564102564,\n\
\ \"acc_norm_stderr\": 0.02860595370200424\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2669220945083014,\n\
\ \"acc_stderr\": 0.015818450894777566,\n \"acc_norm\": 0.2669220945083014,\n\
\ \"acc_norm_stderr\": 0.015818450894777566\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2658959537572254,\n \"acc_stderr\": 0.023786203255508287,\n\
\ \"acc_norm\": 0.2658959537572254,\n \"acc_norm_stderr\": 0.023786203255508287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27150837988826815,\n\
\ \"acc_stderr\": 0.01487425216809527,\n \"acc_norm\": 0.27150837988826815,\n\
\ \"acc_norm_stderr\": 0.01487425216809527\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.27124183006535946,\n \"acc_stderr\": 0.02545775669666787,\n\
\ \"acc_norm\": 0.27124183006535946,\n \"acc_norm_stderr\": 0.02545775669666787\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26688102893890675,\n\
\ \"acc_stderr\": 0.025122637608816632,\n \"acc_norm\": 0.26688102893890675,\n\
\ \"acc_norm_stderr\": 0.025122637608816632\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.023468429832451173,\n\
\ \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.023468429832451173\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.26595744680851063,\n \"acc_stderr\": 0.02635806569888059,\n \
\ \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.02635806569888059\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26401564537157757,\n\
\ \"acc_stderr\": 0.011258435537723818,\n \"acc_norm\": 0.26401564537157757,\n\
\ \"acc_norm_stderr\": 0.011258435537723818\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2536764705882353,\n \"acc_stderr\": 0.026431329870789534,\n\
\ \"acc_norm\": 0.2536764705882353,\n \"acc_norm_stderr\": 0.026431329870789534\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2630718954248366,\n \"acc_stderr\": 0.017812676542320657,\n \
\ \"acc_norm\": 0.2630718954248366,\n \"acc_norm_stderr\": 0.017812676542320657\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.17272727272727273,\n\
\ \"acc_stderr\": 0.03620691833929219,\n \"acc_norm\": 0.17272727272727273,\n\
\ \"acc_norm_stderr\": 0.03620691833929219\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3183673469387755,\n \"acc_stderr\": 0.029822533793982052,\n\
\ \"acc_norm\": 0.3183673469387755,\n \"acc_norm_stderr\": 0.029822533793982052\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n\
\ \"acc_stderr\": 0.03076944496729601,\n \"acc_norm\": 0.2537313432835821,\n\
\ \"acc_norm_stderr\": 0.03076944496729601\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n\
\ \"acc_stderr\": 0.03507295431370519,\n \"acc_norm\": 0.28313253012048195,\n\
\ \"acc_norm_stderr\": 0.03507295431370519\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.26900584795321636,\n \"acc_stderr\": 0.0340105262010409,\n\
\ \"acc_norm\": 0.26900584795321636,\n \"acc_norm_stderr\": 0.0340105262010409\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.015077219200662592,\n \"mc2\": 0.4136763359861922,\n\
\ \"mc2_stderr\": 0.014439422755488887\n }\n}\n```"
repo_url: https://huggingface.co/cerebras/Cerebras-GPT-2.7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:27:41.831056.parquet'
- config_name: results
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- results_2023-07-19T16:27:41.831056.parquet
- split: latest
path:
- results_2023-07-19T16:27:41.831056.parquet
---
# Dataset Card for Evaluation run of cerebras/Cerebras-GPT-2.7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/cerebras/Cerebras-GPT-2.7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [cerebras/Cerebras-GPT-2.7B](https://huggingface.co/cerebras/Cerebras-GPT-2.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cerebras__Cerebras-GPT-2.7B",
"harness_truthfulqa_mc_0",
split="train")
```
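Each timestamped split name in the configs above appears to be the run timestamp with dashes and colons replaced by underscores (the `T` separator and the fractional-second dot are kept). This is an inference from the split names in this card, not an official API, but a small helper can derive the split name from a results timestamp:

```python
def timestamp_to_split(ts: str) -> str:
    """Map a results timestamp to its split name.

    Inferred from this card's configs: dashes and colons become
    underscores; 'T' and the fractional-second dot are unchanged.
    """
    return ts.replace("-", "_").replace(":", "_")

# The run "2023-07-19T16:27:41.831056" maps to the split
# "2023_07_19T16_27_41.831056" used in every config of this card.
print(timestamp_to_split("2023-07-19T16:27:41.831056"))
```

This split name can then be passed as `split=` to `load_dataset` in place of `"latest"` to pin a specific run.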
## Latest results
These are the [latest results from run 2023-07-19T16:27:41.831056](https://huggingface.co/datasets/open-llm-leaderboard/details_cerebras__Cerebras-GPT-2.7B/blob/main/results_2023-07-19T16%3A27%3A41.831056.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2542414842389138,
"acc_stderr": 0.03148980376550113,
"acc_norm": 0.2564241922629107,
"acc_norm_stderr": 0.031497213583879385,
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662592,
"mc2": 0.4136763359861922,
"mc2_stderr": 0.014439422755488887
},
"harness|arc:challenge|25": {
"acc": 0.2696245733788396,
"acc_stderr": 0.012968040686869148,
"acc_norm": 0.2909556313993174,
"acc_norm_stderr": 0.013273077865907592
},
"harness|hellaswag|10": {
"acc": 0.38548097988448515,
"acc_stderr": 0.004857140410776741,
"acc_norm": 0.4929296952798247,
"acc_norm_stderr": 0.004989282516055396
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03820169914517904,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03820169914517904
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19078947368421054,
"acc_stderr": 0.03197565821032499,
"acc_norm": 0.19078947368421054,
"acc_norm_stderr": 0.03197565821032499
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.037161774375660164,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.037161774375660164
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.032147373020294696,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.032147373020294696
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.0379328118530781,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.0379328118530781
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.02850485647051419,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.02850485647051419
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.0414243971948936,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.0414243971948936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20105820105820105,
"acc_stderr": 0.020641810782370165,
"acc_norm": 0.20105820105820105,
"acc_norm_stderr": 0.020641810782370165
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.037184890068181146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.037184890068181146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24516129032258063,
"acc_stderr": 0.02447224384089553,
"acc_norm": 0.24516129032258063,
"acc_norm_stderr": 0.02447224384089553
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.031270907132976984,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.031270907132976984
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.03287666758603488,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.03287666758603488
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.029778663037752947,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.029778663037752947
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23846153846153847,
"acc_stderr": 0.021606294494647727,
"acc_norm": 0.23846153846153847,
"acc_norm_stderr": 0.021606294494647727
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145668,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145668
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868952,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868952
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23302752293577983,
"acc_stderr": 0.0181256691808615,
"acc_norm": 0.23302752293577983,
"acc_norm_stderr": 0.0181256691808615
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.030851992993257013,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.030851992993257013
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.03114557065948678,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.03114557065948678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25738396624472576,
"acc_stderr": 0.028458820991460302,
"acc_norm": 0.25738396624472576,
"acc_norm_stderr": 0.028458820991460302
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.242152466367713,
"acc_stderr": 0.028751392398694755,
"acc_norm": 0.242152466367713,
"acc_norm_stderr": 0.028751392398694755
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.037276735755969174,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.037276735755969174
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03755265865037183,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03755265865037183
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.31901840490797545,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.31901840490797545,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.21359223300970873,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.21359223300970873,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.02860595370200424,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.02860595370200424
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2669220945083014,
"acc_stderr": 0.015818450894777566,
"acc_norm": 0.2669220945083014,
"acc_norm_stderr": 0.015818450894777566
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.023786203255508287,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.023786203255508287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27150837988826815,
"acc_stderr": 0.01487425216809527,
"acc_norm": 0.27150837988826815,
"acc_norm_stderr": 0.01487425216809527
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.27124183006535946,
"acc_stderr": 0.02545775669666787,
"acc_norm": 0.27124183006535946,
"acc_norm_stderr": 0.02545775669666787
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26688102893890675,
"acc_stderr": 0.025122637608816632,
"acc_norm": 0.26688102893890675,
"acc_norm_stderr": 0.025122637608816632
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.023468429832451173,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.023468429832451173
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.02635806569888059,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.02635806569888059
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26401564537157757,
"acc_stderr": 0.011258435537723818,
"acc_norm": 0.26401564537157757,
"acc_norm_stderr": 0.011258435537723818
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2536764705882353,
"acc_stderr": 0.026431329870789534,
"acc_norm": 0.2536764705882353,
"acc_norm_stderr": 0.026431329870789534
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2630718954248366,
"acc_stderr": 0.017812676542320657,
"acc_norm": 0.2630718954248366,
"acc_norm_stderr": 0.017812676542320657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.17272727272727273,
"acc_stderr": 0.03620691833929219,
"acc_norm": 0.17272727272727273,
"acc_norm_stderr": 0.03620691833929219
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3183673469387755,
"acc_stderr": 0.029822533793982052,
"acc_norm": 0.3183673469387755,
"acc_norm_stderr": 0.029822533793982052
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.03076944496729601,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.03076944496729601
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370519,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370519
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.26900584795321636,
"acc_stderr": 0.0340105262010409,
"acc_norm": 0.26900584795321636,
"acc_norm_stderr": 0.0340105262010409
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662592,
"mc2": 0.4136763359861922,
"mc2_stderr": 0.014439422755488887
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-clinical_knowledge-neg-prepend-fix | 2023-08-21T07:31:59.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 5396
num_examples: 5
- name: test
num_bytes: 592677
num_examples: 265
download_size: 12351
dataset_size: 598073
---
# Dataset Card for "mmlu-clinical_knowledge-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Gryphe__MythoLogic-13b | 2023-08-27T12:38:47.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Gryphe/MythoLogic-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Gryphe/MythoLogic-13b](https://huggingface.co/Gryphe/MythoLogic-13b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest\
\ results.\n\nAn additional configuration \"results\" stores all the aggregated\
\ results of the run (and is used to compute and display the aggregated metrics\
\ on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gryphe__MythoLogic-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-19T19:28:00.458162](https://huggingface.co/datasets/open-llm-leaderboard/details_Gryphe__MythoLogic-13b/blob/main/results_2023-07-19T19%3A28%3A00.458162.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4968733077227253,\n\
\ \"acc_stderr\": 0.03502668034067957,\n \"acc_norm\": 0.5006345942551945,\n\
\ \"acc_norm_stderr\": 0.035008326455415545,\n \"mc1\": 0.3463892288861689,\n\
\ \"mc1_stderr\": 0.01665699710912514,\n \"mc2\": 0.49469156179831597,\n\
\ \"mc2_stderr\": 0.015422264829446598\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5597269624573379,\n \"acc_stderr\": 0.014506769524804244,\n\
\ \"acc_norm\": 0.5844709897610921,\n \"acc_norm_stderr\": 0.014401366641216384\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6184027086237801,\n\
\ \"acc_stderr\": 0.004847857546957477,\n \"acc_norm\": 0.81557458673571,\n\
\ \"acc_norm_stderr\": 0.0038703811999679567\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480863,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480863\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.0404633688397825,\n\
\ \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.0404633688397825\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4981132075471698,\n \"acc_stderr\": 0.030772653642075664,\n\
\ \"acc_norm\": 0.4981132075471698,\n \"acc_norm_stderr\": 0.030772653642075664\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4930555555555556,\n\
\ \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.4930555555555556,\n\
\ \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n\
\ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n\
\ \"acc_stderr\": 0.03778621079092056,\n \"acc_norm\": 0.43352601156069365,\n\
\ \"acc_norm_stderr\": 0.03778621079092056\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.032232762667117124,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.032232762667117124\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159393,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159393\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2698412698412698,\n \"acc_stderr\": 0.022860838309232072,\n \"\
acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.022860838309232072\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.043902592653775635,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.043902592653775635\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5612903225806452,\n\
\ \"acc_stderr\": 0.028229497320317213,\n \"acc_norm\": 0.5612903225806452,\n\
\ \"acc_norm_stderr\": 0.028229497320317213\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3645320197044335,\n \"acc_stderr\": 0.0338640574606209,\n\
\ \"acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.0338640574606209\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.037425970438065864,\n\
\ \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.037425970438065864\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6111111111111112,\n \"acc_stderr\": 0.0347327959083696,\n \"acc_norm\"\
: 0.6111111111111112,\n \"acc_norm_stderr\": 0.0347327959083696\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.6787564766839378,\n \"acc_stderr\": 0.033699508685490674,\n\
\ \"acc_norm\": 0.6787564766839378,\n \"acc_norm_stderr\": 0.033699508685490674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.46923076923076923,\n \"acc_stderr\": 0.025302958890850154,\n\
\ \"acc_norm\": 0.46923076923076923,\n \"acc_norm_stderr\": 0.025302958890850154\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073845,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073845\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.48739495798319327,\n \"acc_stderr\": 0.032468167657521745,\n\
\ \"acc_norm\": 0.48739495798319327,\n \"acc_norm_stderr\": 0.032468167657521745\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.035433042343899844,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.035433042343899844\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6605504587155964,\n \"acc_stderr\": 0.02030210934266235,\n \"\
acc_norm\": 0.6605504587155964,\n \"acc_norm_stderr\": 0.02030210934266235\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2824074074074074,\n \"acc_stderr\": 0.030701372111510927,\n \"\
acc_norm\": 0.2824074074074074,\n \"acc_norm_stderr\": 0.030701372111510927\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6372549019607843,\n \"acc_stderr\": 0.03374499356319355,\n \"\
acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.03374499356319355\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.679324894514768,\n \"acc_stderr\": 0.030381931949990407,\n \
\ \"acc_norm\": 0.679324894514768,\n \"acc_norm_stderr\": 0.030381931949990407\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5381165919282511,\n\
\ \"acc_stderr\": 0.033460150119732274,\n \"acc_norm\": 0.5381165919282511,\n\
\ \"acc_norm_stderr\": 0.033460150119732274\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968432,\n \"\
acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968432\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04803752235190192,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04803752235190192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5644171779141104,\n \"acc_stderr\": 0.03895632464138937,\n\
\ \"acc_norm\": 0.5644171779141104,\n \"acc_norm_stderr\": 0.03895632464138937\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.04689765937278135,\n\
\ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.04689765937278135\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7564102564102564,\n\
\ \"acc_stderr\": 0.028120966503914414,\n \"acc_norm\": 0.7564102564102564,\n\
\ \"acc_norm_stderr\": 0.028120966503914414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6832694763729247,\n\
\ \"acc_stderr\": 0.016635566427712564,\n \"acc_norm\": 0.6832694763729247,\n\
\ \"acc_norm_stderr\": 0.016635566427712564\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5173410404624278,\n \"acc_stderr\": 0.02690290045866664,\n\
\ \"acc_norm\": 0.5173410404624278,\n \"acc_norm_stderr\": 0.02690290045866664\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574911,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574911\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.028541722692618874,\n\
\ \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.028541722692618874\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5305466237942122,\n\
\ \"acc_stderr\": 0.028345045864840622,\n \"acc_norm\": 0.5305466237942122,\n\
\ \"acc_norm_stderr\": 0.028345045864840622\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.0277012284685426,\n\
\ \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.0277012284685426\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36879432624113473,\n \"acc_stderr\": 0.028782227561347243,\n \
\ \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.028782227561347243\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39113428943937417,\n\
\ \"acc_stderr\": 0.012463861839982061,\n \"acc_norm\": 0.39113428943937417,\n\
\ \"acc_norm_stderr\": 0.012463861839982061\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4963235294117647,\n \"acc_stderr\": 0.030372015885428195,\n\
\ \"acc_norm\": 0.4963235294117647,\n \"acc_norm_stderr\": 0.030372015885428195\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4918300653594771,\n \"acc_stderr\": 0.02022513434305726,\n \
\ \"acc_norm\": 0.4918300653594771,\n \"acc_norm_stderr\": 0.02022513434305726\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5636363636363636,\n\
\ \"acc_stderr\": 0.04750185058907296,\n \"acc_norm\": 0.5636363636363636,\n\
\ \"acc_norm_stderr\": 0.04750185058907296\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5469387755102041,\n \"acc_stderr\": 0.03186785930004128,\n\
\ \"acc_norm\": 0.5469387755102041,\n \"acc_norm_stderr\": 0.03186785930004128\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6865671641791045,\n\
\ \"acc_stderr\": 0.03280188205348641,\n \"acc_norm\": 0.6865671641791045,\n\
\ \"acc_norm_stderr\": 0.03280188205348641\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.035087719298245626,\n\
\ \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.035087719298245626\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3463892288861689,\n\
\ \"mc1_stderr\": 0.01665699710912514,\n \"mc2\": 0.49469156179831597,\n\
\ \"mc2_stderr\": 0.015422264829446598\n }\n}\n```"
repo_url: https://huggingface.co/Gryphe/MythoLogic-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|arc:challenge|25_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hellaswag|10_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:28:00.458162.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:28:00.458162.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T19:28:00.458162.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T19:28:00.458162.parquet'
- config_name: results
data_files:
- split: 2023_07_19T19_28_00.458162
path:
- results_2023-07-19T19:28:00.458162.parquet
- split: latest
path:
- results_2023-07-19T19:28:00.458162.parquet
---
# Dataset Card for Evaluation run of Gryphe/MythoLogic-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Gryphe/MythoLogic-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Gryphe/MythoLogic-13b](https://huggingface.co/Gryphe/MythoLogic-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Gryphe__MythoLogic-13b",
"harness_truthfulqa_mc_0",
split="train")
```
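The timestamped split names follow a fixed convention: the `-` separators in the date and the `:` separators in the time of the run timestamp are replaced with `_`. A small helper can map a run timestamp to its split name (a sketch; `run_to_split` is a hypothetical name, not part of the `datasets` API):

```python
def run_to_split(timestamp: str) -> str:
    """Convert a run timestamp like '2023-07-19T19:28:00.458162'
    into the corresponding split name used by this dataset."""
    # The split name keeps the fractional seconds but swaps the date
    # dashes and the time colons for underscores.
    date, time = timestamp.split("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")

print(run_to_split("2023-07-19T19:28:00.458162"))
# 2023_07_19T19_28_00.458162
```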
## Latest results
These are the [latest results from run 2023-07-19T19:28:00.458162](https://huggingface.co/datasets/open-llm-leaderboard/details_Gryphe__MythoLogic-13b/blob/main/results_2023-07-19T19%3A28%3A00.458162.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" and "latest" splits for each eval):
```python
{
"all": {
"acc": 0.4968733077227253,
"acc_stderr": 0.03502668034067957,
"acc_norm": 0.5006345942551945,
"acc_norm_stderr": 0.035008326455415545,
"mc1": 0.3463892288861689,
"mc1_stderr": 0.01665699710912514,
"mc2": 0.49469156179831597,
"mc2_stderr": 0.015422264829446598
},
"harness|arc:challenge|25": {
"acc": 0.5597269624573379,
"acc_stderr": 0.014506769524804244,
"acc_norm": 0.5844709897610921,
"acc_norm_stderr": 0.014401366641216384
},
"harness|hellaswag|10": {
"acc": 0.6184027086237801,
"acc_stderr": 0.004847857546957477,
"acc_norm": 0.81557458673571,
"acc_norm_stderr": 0.0038703811999679567
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480863,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480863
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.0404633688397825,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.0404633688397825
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4981132075471698,
"acc_stderr": 0.030772653642075664,
"acc_norm": 0.4981132075471698,
"acc_norm_stderr": 0.030772653642075664
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4930555555555556,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.4930555555555556,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.03778621079092056,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.03778621079092056
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.032232762667117124,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.032232762667117124
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159393,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159393
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.022860838309232072,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.022860838309232072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.043902592653775635,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.043902592653775635
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5612903225806452,
"acc_stderr": 0.028229497320317213,
"acc_norm": 0.5612903225806452,
"acc_norm_stderr": 0.028229497320317213
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.0338640574606209,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.0338640574606209
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.037425970438065864,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.037425970438065864
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.0347327959083696,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.0347327959083696
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6787564766839378,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.6787564766839378,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.46923076923076923,
"acc_stderr": 0.025302958890850154,
"acc_norm": 0.46923076923076923,
"acc_norm_stderr": 0.025302958890850154
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073845,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073845
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.48739495798319327,
"acc_stderr": 0.032468167657521745,
"acc_norm": 0.48739495798319327,
"acc_norm_stderr": 0.032468167657521745
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.035433042343899844,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.035433042343899844
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6605504587155964,
"acc_stderr": 0.02030210934266235,
"acc_norm": 0.6605504587155964,
"acc_norm_stderr": 0.02030210934266235
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2824074074074074,
"acc_stderr": 0.030701372111510927,
"acc_norm": 0.2824074074074074,
"acc_norm_stderr": 0.030701372111510927
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.03374499356319355,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.03374499356319355
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.679324894514768,
"acc_stderr": 0.030381931949990407,
"acc_norm": 0.679324894514768,
"acc_norm_stderr": 0.030381931949990407
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5381165919282511,
"acc_stderr": 0.033460150119732274,
"acc_norm": 0.5381165919282511,
"acc_norm_stderr": 0.033460150119732274
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968432,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968432
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04803752235190192,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04803752235190192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5644171779141104,
"acc_stderr": 0.03895632464138937,
"acc_norm": 0.5644171779141104,
"acc_norm_stderr": 0.03895632464138937
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.04689765937278135,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.04689765937278135
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7564102564102564,
"acc_stderr": 0.028120966503914414,
"acc_norm": 0.7564102564102564,
"acc_norm_stderr": 0.028120966503914414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6832694763729247,
"acc_stderr": 0.016635566427712564,
"acc_norm": 0.6832694763729247,
"acc_norm_stderr": 0.016635566427712564
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5173410404624278,
"acc_stderr": 0.02690290045866664,
"acc_norm": 0.5173410404624278,
"acc_norm_stderr": 0.02690290045866664
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574911,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574911
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.028541722692618874,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.028541722692618874
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5305466237942122,
"acc_stderr": 0.028345045864840622,
"acc_norm": 0.5305466237942122,
"acc_norm_stderr": 0.028345045864840622
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.0277012284685426,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.0277012284685426
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36879432624113473,
"acc_stderr": 0.028782227561347243,
"acc_norm": 0.36879432624113473,
"acc_norm_stderr": 0.028782227561347243
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39113428943937417,
"acc_stderr": 0.012463861839982061,
"acc_norm": 0.39113428943937417,
"acc_norm_stderr": 0.012463861839982061
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4963235294117647,
"acc_stderr": 0.030372015885428195,
"acc_norm": 0.4963235294117647,
"acc_norm_stderr": 0.030372015885428195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4918300653594771,
"acc_stderr": 0.02022513434305726,
"acc_norm": 0.4918300653594771,
"acc_norm_stderr": 0.02022513434305726
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.04750185058907296,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.04750185058907296
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5469387755102041,
"acc_stderr": 0.03186785930004128,
"acc_norm": 0.5469387755102041,
"acc_norm_stderr": 0.03186785930004128
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6865671641791045,
"acc_stderr": 0.03280188205348641,
"acc_norm": 0.6865671641791045,
"acc_norm_stderr": 0.03280188205348641
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.035087719298245626,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.035087719298245626
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3463892288861689,
"mc1_stderr": 0.01665699710912514,
"mc2": 0.49469156179831597,
"mc2_stderr": 0.015422264829446598
}
}
```
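The `"all"` block above is an aggregate over the per-task entries. As an illustration of that kind of aggregation, an unweighted mean of `acc` can be recomputed from a few per-task results (a sketch only, on a truncated sample of the results above; the leaderboard's own aggregation code may differ):

```python
# Recompute a simple unweighted mean of "acc" over harness tasks,
# mirroring the kind of aggregation shown in the "all" block above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.32},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.45185185185185184},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.4473684210526316},
}

accs = [task["acc"] for task in results.values()]
mean_acc = sum(accs) / len(accs)
print(round(mean_acc, 4))
```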
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Gryphe__MythoMix-L2-13b | 2023-09-22T22:23:20.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Gryphe/MythoMix-L2-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Gryphe/MythoMix-L2-13b](https://huggingface.co/Gryphe/MythoMix-L2-13b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gryphe__MythoMix-L2-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T22:23:08.063250](https://huggingface.co/datasets/open-llm-leaderboard/details_Gryphe__MythoMix-L2-13b/blob/main/results_2023-09-22T22-23-08.063250.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.12804110738255034,\n\
\ \"em_stderr\": 0.0034218610287585043,\n \"f1\": 0.19858850671140846,\n\
\ \"f1_stderr\": 0.0035721276185422235,\n \"acc\": 0.42692797214890377,\n\
\ \"acc_stderr\": 0.01016682217493381\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.12804110738255034,\n \"em_stderr\": 0.0034218610287585043,\n\
\ \"f1\": 0.19858850671140846,\n \"f1_stderr\": 0.0035721276185422235\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09931766489764973,\n \
\ \"acc_stderr\": 0.008238371412683973\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7545382794001578,\n \"acc_stderr\": 0.012095272937183644\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Gryphe/MythoMix-L2-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|arc:challenge|25_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T22_23_08.063250
path:
- '**/details_harness|drop|3_2023-09-22T22-23-08.063250.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T22-23-08.063250.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T22_23_08.063250
path:
- '**/details_harness|gsm8k|5_2023-09-22T22-23-08.063250.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T22-23-08.063250.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hellaswag|10_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T21:38:13.191902.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T21:38:13.191902.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T21:38:13.191902.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T22_23_08.063250
path:
- '**/details_harness|winogrande|5_2023-09-22T22-23-08.063250.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T22-23-08.063250.parquet'
- config_name: results
data_files:
- split: 2023_08_09T21_38_13.191902
path:
- results_2023-08-09T21:38:13.191902.parquet
- split: 2023_09_22T22_23_08.063250
path:
- results_2023-09-22T22-23-08.063250.parquet
- split: latest
path:
- results_2023-09-22T22-23-08.063250.parquet
---
# Dataset Card for Evaluation run of Gryphe/MythoMix-L2-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Gryphe/MythoMix-L2-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Gryphe/MythoMix-L2-13b](https://huggingface.co/Gryphe/MythoMix-L2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Gryphe__MythoMix-L2-13b",
"harness_winogrande_5",
split="train")
```
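Per-subject MMLU details are exposed through their own configurations (the `harness_hendrycksTest_*_5` entries in the YAML header above). As a small convenience sketch, a helper like the following can assemble those config names (the name `mmlu_config` is ours, not part of the dataset):

```python
def mmlu_config(subject: str, n_shot: int = 5) -> str:
    """Build a per-subject MMLU config name as listed in this card's YAML header."""
    return f"harness_hendrycksTest_{subject}_{n_shot}"

# e.g. pass the result as the second argument of load_dataset
print(mmlu_config("abstract_algebra"))  # harness_hendrycksTest_abstract_algebra_5
```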
## Latest results
These are the [latest results from run 2023-09-22T22:23:08.063250](https://huggingface.co/datasets/open-llm-leaderboard/details_Gryphe__MythoMix-L2-13b/blob/main/results_2023-09-22T22-23-08.063250.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.12804110738255034,
"em_stderr": 0.0034218610287585043,
"f1": 0.19858850671140846,
"f1_stderr": 0.0035721276185422235,
"acc": 0.42692797214890377,
"acc_stderr": 0.01016682217493381
},
"harness|drop|3": {
"em": 0.12804110738255034,
"em_stderr": 0.0034218610287585043,
"f1": 0.19858850671140846,
"f1_stderr": 0.0035721276185422235
},
"harness|gsm8k|5": {
"acc": 0.09931766489764973,
"acc_stderr": 0.008238371412683973
},
"harness|winogrande|5": {
"acc": 0.7545382794001578,
"acc_stderr": 0.012095272937183644
}
}
```
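As a quick sanity check on the aggregate block above, the `acc` reported under `"all"` is the mean of the per-task accuracies. The snippet below recomputes it; the dictionary literal simply mirrors the JSON payload shown, truncated to the relevant fields:

```python
# Mirrors the "results" payload shown above (accuracy fields only).
results = {
    "all": {"acc": 0.42692797214890377},
    "harness|gsm8k|5": {"acc": 0.09931766489764973},
    "harness|winogrande|5": {"acc": 0.7545382794001578},
}

# The aggregate accuracy is the mean of the per-task accuracies.
task_accs = [v["acc"] for k, v in results.items() if k != "all"]
mean_acc = sum(task_accs) / len(task_accs)
print(round(mean_acc, 4))  # 0.4269
```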
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-college_biology-neg-prepend-fix | 2023-08-21T07:32:12.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 6719
num_examples: 5
- name: test
num_bytes: 417471
num_examples: 144
download_size: 14610
dataset_size: 424190
---
# Dataset Card for "mmlu-college_biology-neg-prepend-fix"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
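The `answer` feature in the YAML above is a `class_label` over the names A–D, i.e. answers are stored as integers. A hypothetical sketch of the letter/integer mapping (names copied from the YAML; `int2str`/`str2int` mirror the behavior of `datasets.ClassLabel` without requiring the library):

```python
# Label names as declared in the dataset's class_label feature above.
ANSWER_NAMES = ["A", "B", "C", "D"]

def int2str(label: int) -> str:
    """Map a stored integer answer back to its letter choice."""
    return ANSWER_NAMES[label]

def str2int(letter: str) -> int:
    """Inverse mapping, e.g. for scoring letter predictions."""
    return ANSWER_NAMES.index(letter)

print(int2str(2))    # C
print(str2int("D"))  # 3
```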
open-llm-leaderboard/details_Gryphe__MythoLogic-L2-13b | 2023-09-23T12:37:18.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Gryphe/MythoLogic-L2-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Gryphe/MythoLogic-L2-13b](https://huggingface.co/Gryphe/MythoLogic-L2-13b) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gryphe__MythoLogic-L2-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T12:37:06.579153](https://huggingface.co/datasets/open-llm-leaderboard/details_Gryphe__MythoLogic-L2-13b/blob/main/results_2023-09-23T12-37-06.579153.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n    \"all\": {\n        \"em\": 0.2177013422818792,\n\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2177013422818792,\n\
\ \"em_stderr\": 0.004226262781727102,\n \"f1\": 0.2842743288590614,\n\
\ \"f1_stderr\": 0.004232535857485872,\n \"acc\": 0.43918283744411857,\n\
\ \"acc_stderr\": 0.01042943655066695\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.2177013422818792,\n \"em_stderr\": 0.004226262781727102,\n\
\ \"f1\": 0.2842743288590614,\n \"f1_stderr\": 0.004232535857485872\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11751326762699014,\n \
\ \"acc_stderr\": 0.008870331256489993\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.760852407261247,\n \"acc_stderr\": 0.011988541844843909\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Gryphe/MythoLogic-L2-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|arc:challenge|25_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T12_37_06.579153
path:
- '**/details_harness|drop|3_2023-09-23T12-37-06.579153.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T12-37-06.579153.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T12_37_06.579153
path:
- '**/details_harness|gsm8k|5_2023-09-23T12-37-06.579153.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T12-37-06.579153.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hellaswag|10_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T12_37_06.579153
path:
- '**/details_harness|winogrande|5_2023-09-23T12-37-06.579153.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T12-37-06.579153.parquet'
- config_name: results
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- results_2023-08-09T11:05:11.641476.parquet
- split: 2023_09_23T12_37_06.579153
path:
- results_2023-09-23T12-37-06.579153.parquet
- split: latest
path:
- results_2023-09-23T12-37-06.579153.parquet
---
# Dataset Card for Evaluation run of Gryphe/MythoLogic-L2-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Gryphe/MythoLogic-L2-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Gryphe/MythoLogic-L2-13b](https://huggingface.co/Gryphe/MythoLogic-L2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Gryphe__MythoLogic-L2-13b",
"harness_winogrande_5",
split="train")
```
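As shown above, each run's split is named after its timestamp, with dashes and colons replaced by underscores (e.g. `2023-09-23T12:37:06.579153` becomes split `2023_09_23T12_37_06.579153`). A minimal helper to derive the split name from a run timestamp — a hypothetical convenience function, not part of the dataset tooling — could look like:

```python
def timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp into its dataset split name.

    Dashes and colons are replaced by underscores; the fractional-second
    dot is kept, matching the split names listed in this card's configs.
    """
    return ts.replace("-", "_").replace(":", "_")


# Example: pick the split for the 2023-09-23 run instead of "latest".
split_name = timestamp_to_split("2023-09-23T12:37:06.579153")
# split_name == "2023_09_23T12_37_06.579153", usable as the `split=` argument
# of `load_dataset` above.
```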
## Latest results
These are the [latest results from run 2023-09-23T12:37:06.579153](https://huggingface.co/datasets/open-llm-leaderboard/details_Gryphe__MythoLogic-L2-13b/blob/main/results_2023-09-23T12-37-06.579153.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.2177013422818792,
"em_stderr": 0.004226262781727102,
"f1": 0.2842743288590614,
"f1_stderr": 0.004232535857485872,
"acc": 0.43918283744411857,
"acc_stderr": 0.01042943655066695
},
"harness|drop|3": {
"em": 0.2177013422818792,
"em_stderr": 0.004226262781727102,
"f1": 0.2842743288590614,
"f1_stderr": 0.004232535857485872
},
"harness|gsm8k|5": {
"acc": 0.11751326762699014,
"acc_stderr": 0.008870331256489993
},
"harness|winogrande|5": {
"acc": 0.760852407261247,
"acc_stderr": 0.011988541844843909
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-college_chemistry-neg-prepend-fix | 2023-08-21T07:32:25.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 6203
num_examples: 5
- name: test
num_bytes: 253144
num_examples: 100
download_size: 14388
dataset_size: 259347
---
# Dataset Card for "mmlu-college_chemistry-neg-prepend-fix"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Gryphe__MythoBoros-13b | 2023-08-27T12:38:52.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Gryphe/MythoBoros-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Gryphe/MythoBoros-13b](https://huggingface.co/Gryphe/MythoBoros-13b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gryphe__MythoBoros-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-24T13:29:01.671194](https://huggingface.co/datasets/open-llm-leaderboard/details_Gryphe__MythoBoros-13b/blob/main/results_2023-07-24T13%3A29%3A01.671194.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5042364929007946,\n\
\ \"acc_stderr\": 0.03494052149030365,\n \"acc_norm\": 0.5080708168077193,\n\
\ \"acc_norm_stderr\": 0.03492182316235082,\n \"mc1\": 0.33659730722154224,\n\
\ \"mc1_stderr\": 0.016542412809494887,\n \"mc2\": 0.4893444522172139,\n\
\ \"mc2_stderr\": 0.015451708408635087\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5537542662116041,\n \"acc_stderr\": 0.014526705548539982,\n\
\ \"acc_norm\": 0.5819112627986348,\n \"acc_norm_stderr\": 0.014413988396996077\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.619398526190002,\n\
\ \"acc_stderr\": 0.00484542452476404,\n \"acc_norm\": 0.8174666401115316,\n\
\ \"acc_norm_stderr\": 0.0038549403270910316\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4605263157894737,\n \"acc_stderr\": 0.04056242252249034,\n\
\ \"acc_norm\": 0.4605263157894737,\n \"acc_norm_stderr\": 0.04056242252249034\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5094339622641509,\n \"acc_stderr\": 0.030767394707808093,\n\
\ \"acc_norm\": 0.5094339622641509,\n \"acc_norm_stderr\": 0.030767394707808093\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4652777777777778,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.4652777777777778,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.42196531791907516,\n\
\ \"acc_stderr\": 0.037657466938651504,\n \"acc_norm\": 0.42196531791907516,\n\
\ \"acc_norm_stderr\": 0.037657466938651504\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.63,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2751322751322751,\n \"acc_stderr\": 0.02300008685906864,\n \"\
acc_norm\": 0.2751322751322751,\n \"acc_norm_stderr\": 0.02300008685906864\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5612903225806452,\n\
\ \"acc_stderr\": 0.028229497320317213,\n \"acc_norm\": 0.5612903225806452,\n\
\ \"acc_norm_stderr\": 0.028229497320317213\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35960591133004927,\n \"acc_stderr\": 0.03376458246509567,\n\
\ \"acc_norm\": 0.35960591133004927,\n \"acc_norm_stderr\": 0.03376458246509567\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.03713158067481913,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.03713158067481913\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6262626262626263,\n \"acc_stderr\": 0.03446897738659333,\n \"\
acc_norm\": 0.6262626262626263,\n \"acc_norm_stderr\": 0.03446897738659333\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6787564766839378,\n \"acc_stderr\": 0.033699508685490674,\n\
\ \"acc_norm\": 0.6787564766839378,\n \"acc_norm_stderr\": 0.033699508685490674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4717948717948718,\n \"acc_stderr\": 0.025310639254933886,\n\
\ \"acc_norm\": 0.4717948717948718,\n \"acc_norm_stderr\": 0.025310639254933886\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23333333333333334,\n \"acc_stderr\": 0.02578787422095933,\n \
\ \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.02578787422095933\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.49159663865546216,\n \"acc_stderr\": 0.03247390276569669,\n\
\ \"acc_norm\": 0.49159663865546216,\n \"acc_norm_stderr\": 0.03247390276569669\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008936,\n \"\
acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008936\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6825688073394496,\n \"acc_stderr\": 0.0199571521984605,\n \"acc_norm\"\
: 0.6825688073394496,\n \"acc_norm_stderr\": 0.0199571521984605\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2916666666666667,\n\
\ \"acc_stderr\": 0.030998666304560534,\n \"acc_norm\": 0.2916666666666667,\n\
\ \"acc_norm_stderr\": 0.030998666304560534\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.6519607843137255,\n \"acc_stderr\": 0.03343311240488418,\n\
\ \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.03343311240488418\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.70042194092827,\n \"acc_stderr\": 0.029818024749753095,\n \
\ \"acc_norm\": 0.70042194092827,\n \"acc_norm_stderr\": 0.029818024749753095\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5695067264573991,\n\
\ \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.5695067264573991,\n\
\ \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.042607351576445594,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.042607351576445594\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6776859504132231,\n \"acc_stderr\": 0.04266416363352168,\n \"\
acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.04266416363352168\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.048262172941398944,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.048262172941398944\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.588957055214724,\n \"acc_stderr\": 0.038656978537853624,\n\
\ \"acc_norm\": 0.588957055214724,\n \"acc_norm_stderr\": 0.038656978537853624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.04689765937278135,\n\
\ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.04689765937278135\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7649572649572649,\n\
\ \"acc_stderr\": 0.027778835904935434,\n \"acc_norm\": 0.7649572649572649,\n\
\ \"acc_norm_stderr\": 0.027778835904935434\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.01632881442210205,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.01632881442210205\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5404624277456648,\n \"acc_stderr\": 0.026830805998952243,\n\
\ \"acc_norm\": 0.5404624277456648,\n \"acc_norm_stderr\": 0.026830805998952243\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2536312849162011,\n\
\ \"acc_stderr\": 0.014551553659369916,\n \"acc_norm\": 0.2536312849162011,\n\
\ \"acc_norm_stderr\": 0.014551553659369916\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5522875816993464,\n \"acc_stderr\": 0.02847293847803353,\n\
\ \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.02847293847803353\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5337620578778135,\n\
\ \"acc_stderr\": 0.02833327710956279,\n \"acc_norm\": 0.5337620578778135,\n\
\ \"acc_norm_stderr\": 0.02833327710956279\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.027648477877413327,\n\
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.027648477877413327\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.35815602836879434,\n \"acc_stderr\": 0.028602085862759415,\n \
\ \"acc_norm\": 0.35815602836879434,\n \"acc_norm_stderr\": 0.028602085862759415\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4002607561929596,\n\
\ \"acc_stderr\": 0.012513582529136213,\n \"acc_norm\": 0.4002607561929596,\n\
\ \"acc_norm_stderr\": 0.012513582529136213\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4742647058823529,\n \"acc_stderr\": 0.03033257809455504,\n\
\ \"acc_norm\": 0.4742647058823529,\n \"acc_norm_stderr\": 0.03033257809455504\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.49673202614379086,\n \"acc_stderr\": 0.020227402794434867,\n \
\ \"acc_norm\": 0.49673202614379086,\n \"acc_norm_stderr\": 0.020227402794434867\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5727272727272728,\n\
\ \"acc_stderr\": 0.04738198703545483,\n \"acc_norm\": 0.5727272727272728,\n\
\ \"acc_norm_stderr\": 0.04738198703545483\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.563265306122449,\n \"acc_stderr\": 0.031751952375833226,\n\
\ \"acc_norm\": 0.563265306122449,\n \"acc_norm_stderr\": 0.031751952375833226\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6865671641791045,\n\
\ \"acc_stderr\": 0.03280188205348641,\n \"acc_norm\": 0.6865671641791045,\n\
\ \"acc_norm_stderr\": 0.03280188205348641\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.03424042924691584,\n\
\ \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.03424042924691584\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33659730722154224,\n\
\ \"mc1_stderr\": 0.016542412809494887,\n \"mc2\": 0.4893444522172139,\n\
\ \"mc2_stderr\": 0.015451708408635087\n }\n}\n```"
repo_url: https://huggingface.co/Gryphe/MythoBoros-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|arc:challenge|25_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hellaswag|10_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T13:29:01.671194.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:29:01.671194.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T13:29:01.671194.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T13:29:01.671194.parquet'
- config_name: results
data_files:
- split: 2023_07_24T13_29_01.671194
path:
- results_2023-07-24T13:29:01.671194.parquet
- split: latest
path:
- results_2023-07-24T13:29:01.671194.parquet
---
# Dataset Card for Evaluation run of Gryphe/MythoBoros-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Gryphe/MythoBoros-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Gryphe/MythoBoros-13b](https://huggingface.co/Gryphe/MythoBoros-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Gryphe__MythoBoros-13b",
"harness_truthfulqa_mc_0",
split="train")
```
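The per-run split names used above are derived from the run timestamp. A minimal sketch of that derivation (the helper name is our own, for illustration only):

```python
def timestamp_to_split(ts: str) -> str:
    """Derive a per-run split name from a run timestamp.

    Split names in this dataset replace '-' and ':' in the ISO
    timestamp with '_' (the '.' before the microseconds is kept).
    """
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-07-24T13:29:01.671194"))
# 2023_07_24T13_29_01.671194
```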
## Latest results
These are the [latest results from run 2023-07-24T13:29:01.671194](https://huggingface.co/datasets/open-llm-leaderboard/details_Gryphe__MythoBoros-13b/blob/main/results_2023-07-24T13%3A29%3A01.671194.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5042364929007946,
"acc_stderr": 0.03494052149030365,
"acc_norm": 0.5080708168077193,
"acc_norm_stderr": 0.03492182316235082,
"mc1": 0.33659730722154224,
"mc1_stderr": 0.016542412809494887,
"mc2": 0.4893444522172139,
"mc2_stderr": 0.015451708408635087
},
"harness|arc:challenge|25": {
"acc": 0.5537542662116041,
"acc_stderr": 0.014526705548539982,
"acc_norm": 0.5819112627986348,
"acc_norm_stderr": 0.014413988396996077
},
"harness|hellaswag|10": {
"acc": 0.619398526190002,
"acc_stderr": 0.00484542452476404,
"acc_norm": 0.8174666401115316,
"acc_norm_stderr": 0.0038549403270910316
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4605263157894737,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.4605263157894737,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5094339622641509,
"acc_stderr": 0.030767394707808093,
"acc_norm": 0.5094339622641509,
"acc_norm_stderr": 0.030767394707808093
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4652777777777778,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.4652777777777778,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.42196531791907516,
"acc_stderr": 0.037657466938651504,
"acc_norm": 0.42196531791907516,
"acc_norm_stderr": 0.037657466938651504
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2751322751322751,
"acc_stderr": 0.02300008685906864,
"acc_norm": 0.2751322751322751,
"acc_norm_stderr": 0.02300008685906864
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5612903225806452,
"acc_stderr": 0.028229497320317213,
"acc_norm": 0.5612903225806452,
"acc_norm_stderr": 0.028229497320317213
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35960591133004927,
"acc_stderr": 0.03376458246509567,
"acc_norm": 0.35960591133004927,
"acc_norm_stderr": 0.03376458246509567
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.03713158067481913,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.03713158067481913
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6262626262626263,
"acc_stderr": 0.03446897738659333,
"acc_norm": 0.6262626262626263,
"acc_norm_stderr": 0.03446897738659333
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6787564766839378,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.6787564766839378,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4717948717948718,
"acc_stderr": 0.025310639254933886,
"acc_norm": 0.4717948717948718,
"acc_norm_stderr": 0.025310639254933886
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.02578787422095933,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.02578787422095933
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.49159663865546216,
"acc_stderr": 0.03247390276569669,
"acc_norm": 0.49159663865546216,
"acc_norm_stderr": 0.03247390276569669
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2251655629139073,
"acc_stderr": 0.03410435282008936,
"acc_norm": 0.2251655629139073,
"acc_norm_stderr": 0.03410435282008936
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6825688073394496,
"acc_stderr": 0.0199571521984605,
"acc_norm": 0.6825688073394496,
"acc_norm_stderr": 0.0199571521984605
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.030998666304560534,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.030998666304560534
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.03343311240488418,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.03343311240488418
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.70042194092827,
"acc_stderr": 0.029818024749753095,
"acc_norm": 0.70042194092827,
"acc_norm_stderr": 0.029818024749753095
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5695067264573991,
"acc_stderr": 0.033231973029429394,
"acc_norm": 0.5695067264573991,
"acc_norm_stderr": 0.033231973029429394
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.042607351576445594,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.042607351576445594
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.04266416363352168,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.04266416363352168
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.048262172941398944,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.048262172941398944
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.588957055214724,
"acc_stderr": 0.038656978537853624,
"acc_norm": 0.588957055214724,
"acc_norm_stderr": 0.038656978537853624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.04689765937278135,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.04689765937278135
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7649572649572649,
"acc_stderr": 0.027778835904935434,
"acc_norm": 0.7649572649572649,
"acc_norm_stderr": 0.027778835904935434
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.01632881442210205,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.01632881442210205
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5404624277456648,
"acc_stderr": 0.026830805998952243,
"acc_norm": 0.5404624277456648,
"acc_norm_stderr": 0.026830805998952243
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2536312849162011,
"acc_stderr": 0.014551553659369916,
"acc_norm": 0.2536312849162011,
"acc_norm_stderr": 0.014551553659369916
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5522875816993464,
"acc_stderr": 0.02847293847803353,
"acc_norm": 0.5522875816993464,
"acc_norm_stderr": 0.02847293847803353
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5337620578778135,
"acc_stderr": 0.02833327710956279,
"acc_norm": 0.5337620578778135,
"acc_norm_stderr": 0.02833327710956279
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.027648477877413327,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.027648477877413327
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35815602836879434,
"acc_stderr": 0.028602085862759415,
"acc_norm": 0.35815602836879434,
"acc_norm_stderr": 0.028602085862759415
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4002607561929596,
"acc_stderr": 0.012513582529136213,
"acc_norm": 0.4002607561929596,
"acc_norm_stderr": 0.012513582529136213
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4742647058823529,
"acc_stderr": 0.03033257809455504,
"acc_norm": 0.4742647058823529,
"acc_norm_stderr": 0.03033257809455504
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.49673202614379086,
"acc_stderr": 0.020227402794434867,
"acc_norm": 0.49673202614379086,
"acc_norm_stderr": 0.020227402794434867
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5727272727272728,
"acc_stderr": 0.04738198703545483,
"acc_norm": 0.5727272727272728,
"acc_norm_stderr": 0.04738198703545483
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.563265306122449,
"acc_stderr": 0.031751952375833226,
"acc_norm": 0.563265306122449,
"acc_norm_stderr": 0.031751952375833226
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6865671641791045,
"acc_stderr": 0.03280188205348641,
"acc_norm": 0.6865671641791045,
"acc_norm_stderr": 0.03280188205348641
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7251461988304093,
"acc_stderr": 0.03424042924691584,
"acc_norm": 0.7251461988304093,
"acc_norm_stderr": 0.03424042924691584
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33659730722154224,
"mc1_stderr": 0.016542412809494887,
"mc2": 0.4893444522172139,
"mc2_stderr": 0.015451708408635087
}
}
```
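Each key in the JSON results above follows a `harness|<task>|<num_fewshot>` pattern. A minimal sketch of splitting such a key into its parts (the helper name is our own, not part of any library):

```python
def parse_task_key(key: str):
    # Keys look like "harness|hendrycksTest-virology|5" or
    # "harness|truthfulqa:mc|0": suite, task name, few-shot count.
    suite, task, shots = key.split("|")
    return suite, task, int(shots)

print(parse_task_key("harness|arc:challenge|25"))
# ('harness', 'arc:challenge', 25)
```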
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]

open-llm-leaderboard/details_jzjiao__opt-1.3b-rlhf | 2023-08-27T12:38:54.000Z | ["region:us"] | open-llm-leaderboard | null | null | null | 0 | 0 |
---
pretty_name: Evaluation run of jzjiao/opt-1.3b-rlhf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jzjiao/opt-1.3b-rlhf](https://huggingface.co/jzjiao/opt-1.3b-rlhf) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jzjiao__opt-1.3b-rlhf\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-19T14:36:48.435460](https://huggingface.co/datasets/open-llm-leaderboard/details_jzjiao__opt-1.3b-rlhf/blob/main/results_2023-07-19T14%3A36%3A48.435460.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2567701403222773,\n\
\ \"acc_stderr\": 0.031499368570930275,\n \"acc_norm\": 0.2591794779216216,\n\
\ \"acc_norm_stderr\": 0.03150672587324535,\n \"mc1\": 0.23011015911872704,\n\
\ \"mc1_stderr\": 0.014734557959807765,\n \"mc2\": 0.37436160193856854,\n\
\ \"mc2_stderr\": 0.01443982811254708\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2645051194539249,\n \"acc_stderr\": 0.012889272949313366,\n\
\ \"acc_norm\": 0.28924914675767915,\n \"acc_norm_stderr\": 0.013250012579393441\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4102768372834097,\n\
\ \"acc_stderr\": 0.004908786109095824,\n \"acc_norm\": 0.5276837283409679,\n\
\ \"acc_norm_stderr\": 0.0049821273156052115\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2814814814814815,\n\
\ \"acc_stderr\": 0.038850042458002554,\n \"acc_norm\": 0.2814814814814815,\n\
\ \"acc_norm_stderr\": 0.038850042458002554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.1513157894736842,\n \"acc_stderr\": 0.029162631596843968,\n\
\ \"acc_norm\": 0.1513157894736842,\n \"acc_norm_stderr\": 0.029162631596843968\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.26037735849056604,\n \"acc_stderr\": 0.02700876609070808,\n\
\ \"acc_norm\": 0.26037735849056604,\n \"acc_norm_stderr\": 0.02700876609070808\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.15,\n \"acc_stderr\": 0.03588702812826369,\n \
\ \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.03588702812826369\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.30057803468208094,\n\
\ \"acc_stderr\": 0.0349610148119118,\n \"acc_norm\": 0.30057803468208094,\n\
\ \"acc_norm_stderr\": 0.0349610148119118\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.0472400735238389,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.0472400735238389\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.2,\n \"acc_stderr\": 0.04020151261036843,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.04020151261036843\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.17872340425531916,\n \"acc_stderr\": 0.02504537327205098,\n\
\ \"acc_norm\": 0.17872340425531916,\n \"acc_norm_stderr\": 0.02504537327205098\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.03695183311650232,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.03695183311650232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23015873015873015,\n \"acc_stderr\": 0.021679219663693152,\n \"\
acc_norm\": 0.23015873015873015,\n \"acc_norm_stderr\": 0.021679219663693152\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047181,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047181\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24838709677419354,\n\
\ \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.24838709677419354,\n\
\ \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.25757575757575757,\n \"acc_stderr\": 0.03115626951964683,\n \"\
acc_norm\": 0.25757575757575757,\n \"acc_norm_stderr\": 0.03115626951964683\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.32124352331606215,\n \"acc_stderr\": 0.033699508685490674,\n\
\ \"acc_norm\": 0.32124352331606215,\n \"acc_norm_stderr\": 0.033699508685490674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3282051282051282,\n \"acc_stderr\": 0.023807633198657262,\n\
\ \"acc_norm\": 0.3282051282051282,\n \"acc_norm_stderr\": 0.023807633198657262\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.02708037281514566,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.02708037281514566\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24369747899159663,\n \"acc_stderr\": 0.027886828078380558,\n\
\ \"acc_norm\": 0.24369747899159663,\n \"acc_norm_stderr\": 0.027886828078380558\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.25871559633027524,\n \"acc_stderr\": 0.01877605231961962,\n \"\
acc_norm\": 0.25871559633027524,\n \"acc_norm_stderr\": 0.01877605231961962\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3611111111111111,\n \"acc_stderr\": 0.032757734861009996,\n \"\
acc_norm\": 0.3611111111111111,\n \"acc_norm_stderr\": 0.032757734861009996\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.28431372549019607,\n \"acc_stderr\": 0.03166009679399811,\n \"\
acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.03166009679399811\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2109704641350211,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.2109704641350211,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.18834080717488788,\n\
\ \"acc_stderr\": 0.026241132996407266,\n \"acc_norm\": 0.18834080717488788,\n\
\ \"acc_norm_stderr\": 0.026241132996407266\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.183206106870229,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.183206106870229,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.32231404958677684,\n \"acc_stderr\": 0.04266416363352168,\n \"\
acc_norm\": 0.32231404958677684,\n \"acc_norm_stderr\": 0.04266416363352168\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.27607361963190186,\n \"acc_stderr\": 0.0351238528370505,\n\
\ \"acc_norm\": 0.27607361963190186,\n \"acc_norm_stderr\": 0.0351238528370505\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16964285714285715,\n\
\ \"acc_stderr\": 0.0356236785009539,\n \"acc_norm\": 0.16964285714285715,\n\
\ \"acc_norm_stderr\": 0.0356236785009539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24358974358974358,\n\
\ \"acc_stderr\": 0.028120966503914394,\n \"acc_norm\": 0.24358974358974358,\n\
\ \"acc_norm_stderr\": 0.028120966503914394\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24904214559386972,\n\
\ \"acc_stderr\": 0.015464676163395958,\n \"acc_norm\": 0.24904214559386972,\n\
\ \"acc_norm_stderr\": 0.015464676163395958\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.02344582627654555,\n\
\ \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.02344582627654555\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2324022346368715,\n\
\ \"acc_stderr\": 0.01412596875467339,\n \"acc_norm\": 0.2324022346368715,\n\
\ \"acc_norm_stderr\": 0.01412596875467339\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.024630048979824792,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.024630048979824792\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.22508038585209003,\n\
\ \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.22508038585209003,\n\
\ \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180848,\n \
\ \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180848\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23859191655801826,\n\
\ \"acc_stderr\": 0.010885929742002205,\n \"acc_norm\": 0.23859191655801826,\n\
\ \"acc_norm_stderr\": 0.010885929742002205\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.43014705882352944,\n \"acc_stderr\": 0.030074971917302875,\n\
\ \"acc_norm\": 0.43014705882352944,\n \"acc_norm_stderr\": 0.030074971917302875\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25163398692810457,\n \"acc_stderr\": 0.017555818091322246,\n \
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.017555818091322246\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n\
\ \"acc_stderr\": 0.04122066502878284,\n \"acc_norm\": 0.24545454545454545,\n\
\ \"acc_norm_stderr\": 0.04122066502878284\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.028920583220675592,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.028920583220675592\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.208955223880597,\n\
\ \"acc_stderr\": 0.028748298931728658,\n \"acc_norm\": 0.208955223880597,\n\
\ \"acc_norm_stderr\": 0.028748298931728658\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21686746987951808,\n\
\ \"acc_stderr\": 0.03208284450356365,\n \"acc_norm\": 0.21686746987951808,\n\
\ \"acc_norm_stderr\": 0.03208284450356365\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.25146198830409355,\n \"acc_stderr\": 0.033275044238468436,\n\
\ \"acc_norm\": 0.25146198830409355,\n \"acc_norm_stderr\": 0.033275044238468436\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23011015911872704,\n\
\ \"mc1_stderr\": 0.014734557959807765,\n \"mc2\": 0.37436160193856854,\n\
\ \"mc2_stderr\": 0.01443982811254708\n }\n}\n```"
repo_url: https://huggingface.co/jzjiao/opt-1.3b-rlhf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:36:48.435460.parquet'
- config_name: results
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- results_2023-07-19T14:36:48.435460.parquet
- split: latest
path:
- results_2023-07-19T14:36:48.435460.parquet
---
# Dataset Card for Evaluation run of jzjiao/opt-1.3b-rlhf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jzjiao/opt-1.3b-rlhf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jzjiao/opt-1.3b-rlhf](https://huggingface.co/jzjiao/opt-1.3b-rlhf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jzjiao__opt-1.3b-rlhf",
"harness_truthfulqa_mc_0",
split="train")
```
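The timestamped split names listed in the configuration above are derived from the run timestamp. Based on the paths in this card, the convention appears to be replacing `-` and `:` with `_` (this mapping is inferred from the config, not documented by the leaderboard):

```python
# Illustrative: map a run timestamp to its split name
# (assumed convention, inferred from the config above: "-" and ":" become "_")
timestamp = "2023-07-19T14:36:48.435460"
split_name = timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2023_07_19T14_36_48.435460
```

Passing such a split name (or `"latest"`) as the `split` argument lets you load the details of a specific run.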
## Latest results
These are the [latest results from run 2023-07-19T14:36:48.435460](https://huggingface.co/datasets/open-llm-leaderboard/details_jzjiao__opt-1.3b-rlhf/blob/main/results_2023-07-19T14%3A36%3A48.435460.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2567701403222773,
"acc_stderr": 0.031499368570930275,
"acc_norm": 0.2591794779216216,
"acc_norm_stderr": 0.03150672587324535,
"mc1": 0.23011015911872704,
"mc1_stderr": 0.014734557959807765,
"mc2": 0.37436160193856854,
"mc2_stderr": 0.01443982811254708
},
"harness|arc:challenge|25": {
"acc": 0.2645051194539249,
"acc_stderr": 0.012889272949313366,
"acc_norm": 0.28924914675767915,
"acc_norm_stderr": 0.013250012579393441
},
"harness|hellaswag|10": {
"acc": 0.4102768372834097,
"acc_stderr": 0.004908786109095824,
"acc_norm": 0.5276837283409679,
"acc_norm_stderr": 0.0049821273156052115
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.038850042458002554,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.038850042458002554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.1513157894736842,
"acc_stderr": 0.029162631596843968,
"acc_norm": 0.1513157894736842,
"acc_norm_stderr": 0.029162631596843968
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.26037735849056604,
"acc_stderr": 0.02700876609070808,
"acc_norm": 0.26037735849056604,
"acc_norm_stderr": 0.02700876609070808
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.15,
"acc_stderr": 0.03588702812826369,
"acc_norm": 0.15,
"acc_norm_stderr": 0.03588702812826369
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.30057803468208094,
"acc_stderr": 0.0349610148119118,
"acc_norm": 0.30057803468208094,
"acc_norm_stderr": 0.0349610148119118
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.0472400735238389,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.0472400735238389
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036843,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036843
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.17872340425531916,
"acc_stderr": 0.02504537327205098,
"acc_norm": 0.17872340425531916,
"acc_norm_stderr": 0.02504537327205098
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.03695183311650232,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.03695183311650232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.021679219663693152,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.021679219663693152
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047181,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047181
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24838709677419354,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.24838709677419354,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25757575757575757,
"acc_stderr": 0.03115626951964683,
"acc_norm": 0.25757575757575757,
"acc_norm_stderr": 0.03115626951964683
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.32124352331606215,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.32124352331606215,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3282051282051282,
"acc_stderr": 0.023807633198657262,
"acc_norm": 0.3282051282051282,
"acc_norm_stderr": 0.023807633198657262
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.02708037281514566,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.02708037281514566
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24369747899159663,
"acc_stderr": 0.027886828078380558,
"acc_norm": 0.24369747899159663,
"acc_norm_stderr": 0.027886828078380558
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.037101857261199946,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.037101857261199946
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.25871559633027524,
"acc_stderr": 0.01877605231961962,
"acc_norm": 0.25871559633027524,
"acc_norm_stderr": 0.01877605231961962
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.032757734861009996,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.032757734861009996
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.03166009679399811,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.03166009679399811
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2109704641350211,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.2109704641350211,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.18834080717488788,
"acc_stderr": 0.026241132996407266,
"acc_norm": 0.18834080717488788,
"acc_norm_stderr": 0.026241132996407266
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.183206106870229,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.183206106870229,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.32231404958677684,
"acc_stderr": 0.04266416363352168,
"acc_norm": 0.32231404958677684,
"acc_norm_stderr": 0.04266416363352168
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.27607361963190186,
"acc_stderr": 0.0351238528370505,
"acc_norm": 0.27607361963190186,
"acc_norm_stderr": 0.0351238528370505
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.16964285714285715,
"acc_stderr": 0.0356236785009539,
"acc_norm": 0.16964285714285715,
"acc_norm_stderr": 0.0356236785009539
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.028120966503914394,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.028120966503914394
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24904214559386972,
"acc_stderr": 0.015464676163395958,
"acc_norm": 0.24904214559386972,
"acc_norm_stderr": 0.015464676163395958
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.02344582627654555,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.02344582627654555
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2324022346368715,
"acc_stderr": 0.01412596875467339,
"acc_norm": 0.2324022346368715,
"acc_norm_stderr": 0.01412596875467339
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.024630048979824792,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.024630048979824792
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.22508038585209003,
"acc_stderr": 0.023720088516179027,
"acc_norm": 0.22508038585209003,
"acc_norm_stderr": 0.023720088516179027
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180848,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23859191655801826,
"acc_stderr": 0.010885929742002205,
"acc_norm": 0.23859191655801826,
"acc_norm_stderr": 0.010885929742002205
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.43014705882352944,
"acc_stderr": 0.030074971917302875,
"acc_norm": 0.43014705882352944,
"acc_norm_stderr": 0.030074971917302875
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.017555818091322246,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.017555818091322246
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.04122066502878284,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.04122066502878284
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.028920583220675592,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.028920583220675592
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.208955223880597,
"acc_stderr": 0.028748298931728658,
"acc_norm": 0.208955223880597,
"acc_norm_stderr": 0.028748298931728658
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21686746987951808,
"acc_stderr": 0.03208284450356365,
"acc_norm": 0.21686746987951808,
"acc_norm_stderr": 0.03208284450356365
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.25146198830409355,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.25146198830409355,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23011015911872704,
"mc1_stderr": 0.014734557959807765,
"mc2": 0.37436160193856854,
"mc2_stderr": 0.01443982811254708
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-college_computer_science-neg-prepend-fix | 2023-08-21T07:32:39.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 11273
num_examples: 5
- name: test
num_bytes: 320200
num_examples: 100
download_size: 22563
dataset_size: 331473
---
# Dataset Card for "mmlu-college_computer_science-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Kiddyz__testlm-1 | 2023-08-27T12:38:55.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Kiddyz/testlm-1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kiddyz/testlm-1](https://huggingface.co/Kiddyz/testlm-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kiddyz__testlm-1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-16T12:53:22.897812](https://huggingface.co/datasets/open-llm-leaderboard/details_Kiddyz__testlm-1/blob/main/results_2023-08-16T12%3A53%3A22.897812.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5128834307003443,\n\
\ \"acc_stderr\": 0.03501260490290392,\n \"acc_norm\": 0.5166256154161327,\n\
\ \"acc_norm_stderr\": 0.03500071412093006,\n \"mc1\": 0.32802937576499386,\n\
\ \"mc1_stderr\": 0.01643563293281503,\n \"mc2\": 0.48413168566081527,\n\
\ \"mc2_stderr\": 0.015167638286466481\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5017064846416383,\n \"acc_stderr\": 0.014611305705056992,\n\
\ \"acc_norm\": 0.5349829351535836,\n \"acc_norm_stderr\": 0.014575583922019669\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5705038836885082,\n\
\ \"acc_stderr\": 0.004939925958728884,\n \"acc_norm\": 0.758016331408086,\n\
\ \"acc_norm_stderr\": 0.004274091605308121\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750573,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750573\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5131578947368421,\n \"acc_stderr\": 0.04067533136309174,\n\
\ \"acc_norm\": 0.5131578947368421,\n \"acc_norm_stderr\": 0.04067533136309174\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5433962264150943,\n \"acc_stderr\": 0.03065674869673943,\n\
\ \"acc_norm\": 0.5433962264150943,\n \"acc_norm_stderr\": 0.03065674869673943\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.041553199555931467,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.041553199555931467\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4624277456647399,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.4624277456647399,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537314,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537314\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.0242785680243077,\n \"acc_norm\"\
: 0.3333333333333333,\n \"acc_norm_stderr\": 0.0242785680243077\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5903225806451613,\n \"acc_stderr\": 0.027976054915347368,\n \"\
acc_norm\": 0.5903225806451613,\n \"acc_norm_stderr\": 0.027976054915347368\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.35960591133004927,\n \"acc_stderr\": 0.033764582465095665,\n \"\
acc_norm\": 0.35960591133004927,\n \"acc_norm_stderr\": 0.033764582465095665\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6262626262626263,\n \"acc_stderr\": 0.03446897738659333,\n \"\
acc_norm\": 0.6262626262626263,\n \"acc_norm_stderr\": 0.03446897738659333\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7202072538860104,\n \"acc_stderr\": 0.03239637046735704,\n\
\ \"acc_norm\": 0.7202072538860104,\n \"acc_norm_stderr\": 0.03239637046735704\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.49743589743589745,\n \"acc_stderr\": 0.025350672979412202,\n\
\ \"acc_norm\": 0.49743589743589745,\n \"acc_norm_stderr\": 0.025350672979412202\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073838,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073838\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5210084033613446,\n \"acc_stderr\": 0.03244980849990029,\n \
\ \"acc_norm\": 0.5210084033613446,\n \"acc_norm_stderr\": 0.03244980849990029\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7119266055045872,\n \"acc_stderr\": 0.01941644589263603,\n \"\
acc_norm\": 0.7119266055045872,\n \"acc_norm_stderr\": 0.01941644589263603\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7156862745098039,\n \"acc_stderr\": 0.03166009679399813,\n \"\
acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.03166009679399813\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7088607594936709,\n \"acc_stderr\": 0.02957160106575337,\n \
\ \"acc_norm\": 0.7088607594936709,\n \"acc_norm_stderr\": 0.02957160106575337\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5919282511210763,\n\
\ \"acc_stderr\": 0.03298574607842822,\n \"acc_norm\": 0.5919282511210763,\n\
\ \"acc_norm_stderr\": 0.03298574607842822\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.04328577215262972,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.04328577215262972\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5705521472392638,\n \"acc_stderr\": 0.03889066619112722,\n\
\ \"acc_norm\": 0.5705521472392638,\n \"acc_norm_stderr\": 0.03889066619112722\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7649572649572649,\n\
\ \"acc_stderr\": 0.027778835904935434,\n \"acc_norm\": 0.7649572649572649,\n\
\ \"acc_norm_stderr\": 0.027778835904935434\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7088122605363985,\n\
\ \"acc_stderr\": 0.0162460870697014,\n \"acc_norm\": 0.7088122605363985,\n\
\ \"acc_norm_stderr\": 0.0162460870697014\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5173410404624278,\n \"acc_stderr\": 0.026902900458666647,\n\
\ \"acc_norm\": 0.5173410404624278,\n \"acc_norm_stderr\": 0.026902900458666647\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29720670391061454,\n\
\ \"acc_stderr\": 0.015285313353641602,\n \"acc_norm\": 0.29720670391061454,\n\
\ \"acc_norm_stderr\": 0.015285313353641602\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.028452639985088006,\n\
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.028452639985088006\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6045016077170418,\n\
\ \"acc_stderr\": 0.027770918531427838,\n \"acc_norm\": 0.6045016077170418,\n\
\ \"acc_norm_stderr\": 0.027770918531427838\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5709876543209876,\n \"acc_stderr\": 0.027538925613470863,\n\
\ \"acc_norm\": 0.5709876543209876,\n \"acc_norm_stderr\": 0.027538925613470863\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3971631205673759,\n \"acc_stderr\": 0.0291898056735871,\n \
\ \"acc_norm\": 0.3971631205673759,\n \"acc_norm_stderr\": 0.0291898056735871\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3754889178617992,\n\
\ \"acc_stderr\": 0.012367945396728208,\n \"acc_norm\": 0.3754889178617992,\n\
\ \"acc_norm_stderr\": 0.012367945396728208\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.03035969707904611,\n\
\ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03035969707904611\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.49836601307189543,\n \"acc_stderr\": 0.020227726838150124,\n \
\ \"acc_norm\": 0.49836601307189543,\n \"acc_norm_stderr\": 0.020227726838150124\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.031251275910891656,\n\
\ \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.031251275910891656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6716417910447762,\n\
\ \"acc_stderr\": 0.033206858897443244,\n \"acc_norm\": 0.6716417910447762,\n\
\ \"acc_norm_stderr\": 0.033206858897443244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3795180722891566,\n\
\ \"acc_stderr\": 0.03777798822748018,\n \"acc_norm\": 0.3795180722891566,\n\
\ \"acc_norm_stderr\": 0.03777798822748018\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03565079670708311,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03565079670708311\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32802937576499386,\n\
\ \"mc1_stderr\": 0.01643563293281503,\n \"mc2\": 0.48413168566081527,\n\
\ \"mc2_stderr\": 0.015167638286466481\n }\n}\n```"
repo_url: https://huggingface.co/togethercomputer/Pythia-Chat-Base-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|arc:challenge|25_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hellaswag|10_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-16T12:53:22.897812.parquet'
- config_name: results
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- results_2023-08-16T12:53:22.897812.parquet
- split: latest
path:
- results_2023-08-16T12:53:22.897812.parquet
---
# Dataset Card for Evaluation run of Kiddyz/testlm-1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Kiddyz/testlm-1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Kiddyz/testlm-1](https://huggingface.co/Kiddyz/testlm-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kiddyz__testlm-1",
"harness_truthfulqa_mc_0",
split="train")
```
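Config names follow the `harness_<task>_<num_fewshot>` pattern visible in the YAML above; a small helper (hypothetical, purely for illustration) can build the config string for any sub-task before passing it to `load_dataset`:

```python
def config_name(task: str, num_fewshot: int) -> str:
    """Build a config name matching the harness_<task>_<fewshot> pattern
    used by this dataset (this helper is illustrative, not part of the card)."""
    return f"harness_{task}_{num_fewshot}"

# e.g. the 5-shot MMLU anatomy sub-task:
print(config_name("hendrycksTest_anatomy", 5))  # harness_hendrycksTest_anatomy_5
```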
## Latest results
These are the [latest results from run 2023-08-16T12:53:22.897812](https://huggingface.co/datasets/open-llm-leaderboard/details_Kiddyz__testlm-1/blob/main/results_2023-08-16T12%3A53%3A22.897812.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5128834307003443,
"acc_stderr": 0.03501260490290392,
"acc_norm": 0.5166256154161327,
"acc_norm_stderr": 0.03500071412093006,
"mc1": 0.32802937576499386,
"mc1_stderr": 0.01643563293281503,
"mc2": 0.48413168566081527,
"mc2_stderr": 0.015167638286466481
},
"harness|arc:challenge|25": {
"acc": 0.5017064846416383,
"acc_stderr": 0.014611305705056992,
"acc_norm": 0.5349829351535836,
"acc_norm_stderr": 0.014575583922019669
},
"harness|hellaswag|10": {
"acc": 0.5705038836885082,
"acc_stderr": 0.004939925958728884,
"acc_norm": 0.758016331408086,
"acc_norm_stderr": 0.004274091605308121
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750573,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750573
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5131578947368421,
"acc_stderr": 0.04067533136309174,
"acc_norm": 0.5131578947368421,
"acc_norm_stderr": 0.04067533136309174
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5433962264150943,
"acc_stderr": 0.03065674869673943,
"acc_norm": 0.5433962264150943,
"acc_norm_stderr": 0.03065674869673943
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.041553199555931467,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.041553199555931467
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537314,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537314
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.0242785680243077,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.0242785680243077
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5903225806451613,
"acc_stderr": 0.027976054915347368,
"acc_norm": 0.5903225806451613,
"acc_norm_stderr": 0.027976054915347368
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35960591133004927,
"acc_stderr": 0.033764582465095665,
"acc_norm": 0.35960591133004927,
"acc_norm_stderr": 0.033764582465095665
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6262626262626263,
"acc_stderr": 0.03446897738659333,
"acc_norm": 0.6262626262626263,
"acc_norm_stderr": 0.03446897738659333
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7202072538860104,
"acc_stderr": 0.03239637046735704,
"acc_norm": 0.7202072538860104,
"acc_norm_stderr": 0.03239637046735704
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.49743589743589745,
"acc_stderr": 0.025350672979412202,
"acc_norm": 0.49743589743589745,
"acc_norm_stderr": 0.025350672979412202
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073838,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073838
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5210084033613446,
"acc_stderr": 0.03244980849990029,
"acc_norm": 0.5210084033613446,
"acc_norm_stderr": 0.03244980849990029
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7119266055045872,
"acc_stderr": 0.01941644589263603,
"acc_norm": 0.7119266055045872,
"acc_norm_stderr": 0.01941644589263603
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.03166009679399813,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.03166009679399813
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7088607594936709,
"acc_stderr": 0.02957160106575337,
"acc_norm": 0.7088607594936709,
"acc_norm_stderr": 0.02957160106575337
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5919282511210763,
"acc_stderr": 0.03298574607842822,
"acc_norm": 0.5919282511210763,
"acc_norm_stderr": 0.03298574607842822
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.04328577215262972,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.04328577215262972
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04766075165356461,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04766075165356461
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5705521472392638,
"acc_stderr": 0.03889066619112722,
"acc_norm": 0.5705521472392638,
"acc_norm_stderr": 0.03889066619112722
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340456,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340456
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7649572649572649,
"acc_stderr": 0.027778835904935434,
"acc_norm": 0.7649572649572649,
"acc_norm_stderr": 0.027778835904935434
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7088122605363985,
"acc_stderr": 0.0162460870697014,
"acc_norm": 0.7088122605363985,
"acc_norm_stderr": 0.0162460870697014
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5173410404624278,
"acc_stderr": 0.026902900458666647,
"acc_norm": 0.5173410404624278,
"acc_norm_stderr": 0.026902900458666647
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29720670391061454,
"acc_stderr": 0.015285313353641602,
"acc_norm": 0.29720670391061454,
"acc_norm_stderr": 0.015285313353641602
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.028452639985088006,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.028452639985088006
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6045016077170418,
"acc_stderr": 0.027770918531427838,
"acc_norm": 0.6045016077170418,
"acc_norm_stderr": 0.027770918531427838
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5709876543209876,
"acc_stderr": 0.027538925613470863,
"acc_norm": 0.5709876543209876,
"acc_norm_stderr": 0.027538925613470863
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3971631205673759,
"acc_stderr": 0.0291898056735871,
"acc_norm": 0.3971631205673759,
"acc_norm_stderr": 0.0291898056735871
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3754889178617992,
"acc_stderr": 0.012367945396728208,
"acc_norm": 0.3754889178617992,
"acc_norm_stderr": 0.012367945396728208
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.03035969707904611,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.03035969707904611
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.49836601307189543,
"acc_stderr": 0.020227726838150124,
"acc_norm": 0.49836601307189543,
"acc_norm_stderr": 0.020227726838150124
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6081632653061224,
"acc_stderr": 0.031251275910891656,
"acc_norm": 0.6081632653061224,
"acc_norm_stderr": 0.031251275910891656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6716417910447762,
"acc_stderr": 0.033206858897443244,
"acc_norm": 0.6716417910447762,
"acc_norm_stderr": 0.033206858897443244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3795180722891566,
"acc_stderr": 0.03777798822748018,
"acc_norm": 0.3795180722891566,
"acc_norm_stderr": 0.03777798822748018
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03565079670708311,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03565079670708311
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32802937576499386,
"mc1_stderr": 0.01643563293281503,
"mc2": 0.48413168566081527,
"mc2_stderr": 0.015167638286466481
}
}
```
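Each entry in the results dict above maps a task name to its metrics. As a minimal sketch (using a hand-copied subset of the scores, since the full dict normally comes from the loaded "results" config), the `acc_norm` scores of the MMLU (`hendrycksTest`) sub-tasks can be collected and averaged like this:

```python
# Sketch: average acc_norm across hendrycksTest sub-tasks, assuming the
# nested {task: {metric: value}} layout shown in the JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.25},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.4740740740740741},
    "harness|truthfulqa:mc|0": {"mc2": 0.48413168566081527},
}

mmlu_scores = [
    metrics["acc_norm"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(f"MMLU average acc_norm over {len(mmlu_scores)} sub-tasks: {mmlu_avg:.4f}")
```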
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-college_mathematics-neg-prepend-fix | 2023-08-21T07:32:49.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 7619
num_examples: 5
- name: test
num_bytes: 286539
num_examples: 100
download_size: 18411
dataset_size: 294158
---
# Dataset Card for "mmlu-college_mathematics-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Kiddyz__testlm2 | 2023-08-27T12:38:57.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Kiddyz/testlm2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kiddyz/testlm2](https://huggingface.co/Kiddyz/testlm2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kiddyz__testlm2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-17T13:24:31.212024](https://huggingface.co/datasets/open-llm-leaderboard/details_Kiddyz__testlm2/blob/main/results_2023-08-17T13%3A24%3A31.212024.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5159377671021478,\n\
\ \"acc_stderr\": 0.03487459607154835,\n \"acc_norm\": 0.519591954426014,\n\
\ \"acc_norm_stderr\": 0.03486307795106439,\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.01641987473113503,\n \"mc2\": 0.4867717626441099,\n\
\ \"mc2_stderr\": 0.015479620334955967\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49829351535836175,\n \"acc_stderr\": 0.014611305705056987,\n\
\ \"acc_norm\": 0.5298634812286689,\n \"acc_norm_stderr\": 0.0145853058400071\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5723959370643298,\n\
\ \"acc_stderr\": 0.004937199759947679,\n \"acc_norm\": 0.7564230233021311,\n\
\ \"acc_norm_stderr\": 0.004283630516444485\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04068942293855797,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04068942293855797\n },\n\
\ \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.5584905660377358,\n \"acc_stderr\": 0.030561590426731833,\n \
\ \"acc_norm\": 0.5584905660377358,\n \"acc_norm_stderr\": 0.030561590426731833\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5347222222222222,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.5347222222222222,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4508670520231214,\n\
\ \"acc_stderr\": 0.03794012674697029,\n \"acc_norm\": 0.4508670520231214,\n\
\ \"acc_norm_stderr\": 0.03794012674697029\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.039505818611799616,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.039505818611799616\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715563,\n\
\ \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715563\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.291005291005291,\n \"acc_stderr\": 0.023393826500484875,\n \"\
acc_norm\": 0.291005291005291,\n \"acc_norm_stderr\": 0.023393826500484875\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5741935483870968,\n\
\ \"acc_stderr\": 0.028129112709165904,\n \"acc_norm\": 0.5741935483870968,\n\
\ \"acc_norm_stderr\": 0.028129112709165904\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3842364532019704,\n \"acc_stderr\": 0.0342239856565755,\n\
\ \"acc_norm\": 0.3842364532019704,\n \"acc_norm_stderr\": 0.0342239856565755\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.036639749943912434,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.036639749943912434\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6515151515151515,\n \"acc_stderr\": 0.03394853965156402,\n \"\
acc_norm\": 0.6515151515151515,\n \"acc_norm_stderr\": 0.03394853965156402\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7564766839378239,\n \"acc_stderr\": 0.030975436386845426,\n\
\ \"acc_norm\": 0.7564766839378239,\n \"acc_norm_stderr\": 0.030975436386845426\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4794871794871795,\n \"acc_stderr\": 0.025329663163489943,\n\
\ \"acc_norm\": 0.4794871794871795,\n \"acc_norm_stderr\": 0.025329663163489943\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.48739495798319327,\n \"acc_stderr\": 0.032468167657521745,\n\
\ \"acc_norm\": 0.48739495798319327,\n \"acc_norm_stderr\": 0.032468167657521745\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6880733944954128,\n \"acc_stderr\": 0.019862967976707245,\n \"\
acc_norm\": 0.6880733944954128,\n \"acc_norm_stderr\": 0.019862967976707245\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.41203703703703703,\n \"acc_stderr\": 0.03356787758160835,\n \"\
acc_norm\": 0.41203703703703703,\n \"acc_norm_stderr\": 0.03356787758160835\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7401960784313726,\n \"acc_stderr\": 0.030778554678693247,\n \"\
acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.030778554678693247\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842548,\n \
\ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842548\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.03227790442850499,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.03227790442850499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6942148760330579,\n \"acc_stderr\": 0.042059539338841226,\n \"\
acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.042059539338841226\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5828220858895705,\n \"acc_stderr\": 0.0387410285981808,\n\
\ \"acc_norm\": 0.5828220858895705,\n \"acc_norm_stderr\": 0.0387410285981808\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7649572649572649,\n\
\ \"acc_stderr\": 0.027778835904935434,\n \"acc_norm\": 0.7649572649572649,\n\
\ \"acc_norm_stderr\": 0.027778835904935434\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7100893997445722,\n\
\ \"acc_stderr\": 0.01622501794477097,\n \"acc_norm\": 0.7100893997445722,\n\
\ \"acc_norm_stderr\": 0.01622501794477097\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5375722543352601,\n \"acc_stderr\": 0.026842985519615375,\n\
\ \"acc_norm\": 0.5375722543352601,\n \"acc_norm_stderr\": 0.026842985519615375\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2927374301675978,\n\
\ \"acc_stderr\": 0.015218109544410184,\n \"acc_norm\": 0.2927374301675978,\n\
\ \"acc_norm_stderr\": 0.015218109544410184\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.028452639985088006,\n\
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.028452639985088006\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5980707395498392,\n\
\ \"acc_stderr\": 0.027846476005930473,\n \"acc_norm\": 0.5980707395498392,\n\
\ \"acc_norm_stderr\": 0.027846476005930473\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.027586006221607708,\n\
\ \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.027586006221607708\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4078014184397163,\n \"acc_stderr\": 0.02931601177634356,\n \
\ \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.02931601177634356\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3898305084745763,\n\
\ \"acc_stderr\": 0.012456386619082604,\n \"acc_norm\": 0.3898305084745763,\n\
\ \"acc_norm_stderr\": 0.012456386619082604\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.45955882352941174,\n \"acc_stderr\": 0.03027332507734576,\n\
\ \"acc_norm\": 0.45955882352941174,\n \"acc_norm_stderr\": 0.03027332507734576\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5016339869281046,\n \"acc_stderr\": 0.020227726838150124,\n \
\ \"acc_norm\": 0.5016339869281046,\n \"acc_norm_stderr\": 0.020227726838150124\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.03093285879278985,\n\
\ \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.03093285879278985\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6915422885572139,\n\
\ \"acc_stderr\": 0.03265819588512697,\n \"acc_norm\": 0.6915422885572139,\n\
\ \"acc_norm_stderr\": 0.03265819588512697\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7076023391812866,\n \"acc_stderr\": 0.03488647713457922,\n\
\ \"acc_norm\": 0.7076023391812866,\n \"acc_norm_stderr\": 0.03488647713457922\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.01641987473113503,\n \"mc2\": 0.4867717626441099,\n\
\ \"mc2_stderr\": 0.015479620334955967\n }\n}\n```"
repo_url: https://huggingface.co/Kiddyz/testlm2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|arc:challenge|25_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hellaswag|10_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T13:24:31.212024.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T13:24:31.212024.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T13:24:31.212024.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T13:24:31.212024.parquet'
- config_name: results
data_files:
- split: 2023_08_17T13_24_31.212024
path:
- results_2023-08-17T13:24:31.212024.parquet
- split: latest
path:
- results_2023-08-17T13:24:31.212024.parquet
---
# Dataset Card for Evaluation run of Kiddyz/testlm2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Kiddyz/testlm2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Kiddyz/testlm2](https://huggingface.co/Kiddyz/testlm2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kiddyz__testlm2",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-17T13:24:31.212024](https://huggingface.co/datasets/open-llm-leaderboard/details_Kiddyz__testlm2/blob/main/results_2023-08-17T13%3A24%3A31.212024.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5159377671021478,
"acc_stderr": 0.03487459607154835,
"acc_norm": 0.519591954426014,
"acc_norm_stderr": 0.03486307795106439,
"mc1": 0.3268053855569155,
"mc1_stderr": 0.01641987473113503,
"mc2": 0.4867717626441099,
"mc2_stderr": 0.015479620334955967
},
"harness|arc:challenge|25": {
"acc": 0.49829351535836175,
"acc_stderr": 0.014611305705056987,
"acc_norm": 0.5298634812286689,
"acc_norm_stderr": 0.0145853058400071
},
"harness|hellaswag|10": {
"acc": 0.5723959370643298,
"acc_stderr": 0.004937199759947679,
"acc_norm": 0.7564230233021311,
"acc_norm_stderr": 0.004283630516444485
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5,
"acc_stderr": 0.04068942293855797,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04068942293855797
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5584905660377358,
"acc_stderr": 0.030561590426731833,
"acc_norm": 0.5584905660377358,
"acc_norm_stderr": 0.030561590426731833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5347222222222222,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.5347222222222222,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4508670520231214,
"acc_stderr": 0.03794012674697029,
"acc_norm": 0.4508670520231214,
"acc_norm_stderr": 0.03794012674697029
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.039505818611799616,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.039505818611799616
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715563,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.291005291005291,
"acc_stderr": 0.023393826500484875,
"acc_norm": 0.291005291005291,
"acc_norm_stderr": 0.023393826500484875
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5741935483870968,
"acc_stderr": 0.028129112709165904,
"acc_norm": 0.5741935483870968,
"acc_norm_stderr": 0.028129112709165904
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3842364532019704,
"acc_stderr": 0.0342239856565755,
"acc_norm": 0.3842364532019704,
"acc_norm_stderr": 0.0342239856565755
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.036639749943912434,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.036639749943912434
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6515151515151515,
"acc_stderr": 0.03394853965156402,
"acc_norm": 0.6515151515151515,
"acc_norm_stderr": 0.03394853965156402
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7564766839378239,
"acc_stderr": 0.030975436386845426,
"acc_norm": 0.7564766839378239,
"acc_norm_stderr": 0.030975436386845426
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4794871794871795,
"acc_stderr": 0.025329663163489943,
"acc_norm": 0.4794871794871795,
"acc_norm_stderr": 0.025329663163489943
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.48739495798319327,
"acc_stderr": 0.032468167657521745,
"acc_norm": 0.48739495798319327,
"acc_norm_stderr": 0.032468167657521745
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6880733944954128,
"acc_stderr": 0.019862967976707245,
"acc_norm": 0.6880733944954128,
"acc_norm_stderr": 0.019862967976707245
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.41203703703703703,
"acc_stderr": 0.03356787758160835,
"acc_norm": 0.41203703703703703,
"acc_norm_stderr": 0.03356787758160835
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.030778554678693247,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.030778554678693247
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.029178682304842548,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.029178682304842548
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.03227790442850499,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.03227790442850499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.042059539338841226,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.042059539338841226
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04766075165356461,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04766075165356461
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5828220858895705,
"acc_stderr": 0.0387410285981808,
"acc_norm": 0.5828220858895705,
"acc_norm_stderr": 0.0387410285981808
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7649572649572649,
"acc_stderr": 0.027778835904935434,
"acc_norm": 0.7649572649572649,
"acc_norm_stderr": 0.027778835904935434
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7100893997445722,
"acc_stderr": 0.01622501794477097,
"acc_norm": 0.7100893997445722,
"acc_norm_stderr": 0.01622501794477097
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.026842985519615375,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.026842985519615375
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2927374301675978,
"acc_stderr": 0.015218109544410184,
"acc_norm": 0.2927374301675978,
"acc_norm_stderr": 0.015218109544410184
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.028452639985088006,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.028452639985088006
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5980707395498392,
"acc_stderr": 0.027846476005930473,
"acc_norm": 0.5980707395498392,
"acc_norm_stderr": 0.027846476005930473
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.027586006221607708,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.027586006221607708
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4078014184397163,
"acc_stderr": 0.02931601177634356,
"acc_norm": 0.4078014184397163,
"acc_norm_stderr": 0.02931601177634356
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3898305084745763,
"acc_stderr": 0.012456386619082604,
"acc_norm": 0.3898305084745763,
"acc_norm_stderr": 0.012456386619082604
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.45955882352941174,
"acc_stderr": 0.03027332507734576,
"acc_norm": 0.45955882352941174,
"acc_norm_stderr": 0.03027332507734576
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5016339869281046,
"acc_stderr": 0.020227726838150124,
"acc_norm": 0.5016339869281046,
"acc_norm_stderr": 0.020227726838150124
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6285714285714286,
"acc_stderr": 0.03093285879278985,
"acc_norm": 0.6285714285714286,
"acc_norm_stderr": 0.03093285879278985
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6915422885572139,
"acc_stderr": 0.03265819588512697,
"acc_norm": 0.6915422885572139,
"acc_norm_stderr": 0.03265819588512697
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7076023391812866,
"acc_stderr": 0.03488647713457922,
"acc_norm": 0.7076023391812866,
"acc_norm_stderr": 0.03488647713457922
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3268053855569155,
"mc1_stderr": 0.01641987473113503,
"mc2": 0.4867717626441099,
"mc2_stderr": 0.015479620334955967
}
}
```
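As a quick illustration, the per-task scores in the JSON above can be aggregated with a few lines of Python. This is a sketch that reproduces only three of the 57 `hendrycksTest` entries for brevity; the same pattern applies to the full results dict.

```python
# Sketch: aggregating per-task accuracies from a results dict like the one above.
# Only three of the 57 hendrycksTest entries are reproduced here for brevity.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.28},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.4888888888888889},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.5},
}

# Select the MMLU (hendrycksTest) tasks by their key prefix and average acc_norm.
mmlu_keys = [k for k in results if k.startswith("harness|hendrycksTest-")]
mmlu_avg = sum(results[k]["acc_norm"] for k in mmlu_keys) / len(mmlu_keys)
print(f"hendrycksTest average acc_norm over {len(mmlu_keys)} tasks: {mmlu_avg:.4f}")
```

The "all" block at the top of the JSON is computed analogously, averaging over every evaluated task rather than the MMLU subset alone.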
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
---
pretty_name: Evaluation run of Kiddyz/testlm-1-1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kiddyz/testlm-1-1](https://huggingface.co/Kiddyz/testlm-1-1) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kiddyz__testlm-1-1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-16T10:49:10.911062](https://huggingface.co/datasets/open-llm-leaderboard/details_Kiddyz__testlm-1-1/blob/main/results_2023-08-16T10%3A49%3A10.911062.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks; you can find each one in the results and in the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5128834307003443,\n\
\ \"acc_stderr\": 0.03501260490290392,\n \"acc_norm\": 0.5166256154161327,\n\
\ \"acc_norm_stderr\": 0.03500071412093006,\n \"mc1\": 0.32802937576499386,\n\
\ \"mc1_stderr\": 0.01643563293281503,\n \"mc2\": 0.48413168566081527,\n\
\ \"mc2_stderr\": 0.015167638286466481\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5017064846416383,\n \"acc_stderr\": 0.014611305705056992,\n\
\ \"acc_norm\": 0.5349829351535836,\n \"acc_norm_stderr\": 0.014575583922019669\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5705038836885082,\n\
\ \"acc_stderr\": 0.004939925958728884,\n \"acc_norm\": 0.758016331408086,\n\
\ \"acc_norm_stderr\": 0.004274091605308121\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750573,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750573\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5131578947368421,\n \"acc_stderr\": 0.04067533136309174,\n\
\ \"acc_norm\": 0.5131578947368421,\n \"acc_norm_stderr\": 0.04067533136309174\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5433962264150943,\n \"acc_stderr\": 0.03065674869673943,\n\
\ \"acc_norm\": 0.5433962264150943,\n \"acc_norm_stderr\": 0.03065674869673943\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.041553199555931467,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.041553199555931467\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4624277456647399,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.4624277456647399,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537314,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537314\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.0242785680243077,\n \"acc_norm\"\
: 0.3333333333333333,\n \"acc_norm_stderr\": 0.0242785680243077\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5903225806451613,\n \"acc_stderr\": 0.027976054915347368,\n \"\
acc_norm\": 0.5903225806451613,\n \"acc_norm_stderr\": 0.027976054915347368\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.35960591133004927,\n \"acc_stderr\": 0.033764582465095665,\n \"\
acc_norm\": 0.35960591133004927,\n \"acc_norm_stderr\": 0.033764582465095665\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6262626262626263,\n \"acc_stderr\": 0.03446897738659333,\n \"\
acc_norm\": 0.6262626262626263,\n \"acc_norm_stderr\": 0.03446897738659333\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7202072538860104,\n \"acc_stderr\": 0.03239637046735704,\n\
\ \"acc_norm\": 0.7202072538860104,\n \"acc_norm_stderr\": 0.03239637046735704\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.49743589743589745,\n \"acc_stderr\": 0.025350672979412202,\n\
\ \"acc_norm\": 0.49743589743589745,\n \"acc_norm_stderr\": 0.025350672979412202\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073838,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073838\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5210084033613446,\n \"acc_stderr\": 0.03244980849990029,\n \
\ \"acc_norm\": 0.5210084033613446,\n \"acc_norm_stderr\": 0.03244980849990029\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7119266055045872,\n \"acc_stderr\": 0.01941644589263603,\n \"\
acc_norm\": 0.7119266055045872,\n \"acc_norm_stderr\": 0.01941644589263603\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7156862745098039,\n \"acc_stderr\": 0.03166009679399813,\n \"\
acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.03166009679399813\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7088607594936709,\n \"acc_stderr\": 0.02957160106575337,\n \
\ \"acc_norm\": 0.7088607594936709,\n \"acc_norm_stderr\": 0.02957160106575337\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5919282511210763,\n\
\ \"acc_stderr\": 0.03298574607842822,\n \"acc_norm\": 0.5919282511210763,\n\
\ \"acc_norm_stderr\": 0.03298574607842822\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.04328577215262972,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.04328577215262972\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5705521472392638,\n \"acc_stderr\": 0.03889066619112722,\n\
\ \"acc_norm\": 0.5705521472392638,\n \"acc_norm_stderr\": 0.03889066619112722\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7649572649572649,\n\
\ \"acc_stderr\": 0.027778835904935434,\n \"acc_norm\": 0.7649572649572649,\n\
\ \"acc_norm_stderr\": 0.027778835904935434\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7088122605363985,\n\
\ \"acc_stderr\": 0.0162460870697014,\n \"acc_norm\": 0.7088122605363985,\n\
\ \"acc_norm_stderr\": 0.0162460870697014\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5173410404624278,\n \"acc_stderr\": 0.026902900458666647,\n\
\ \"acc_norm\": 0.5173410404624278,\n \"acc_norm_stderr\": 0.026902900458666647\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29720670391061454,\n\
\ \"acc_stderr\": 0.015285313353641602,\n \"acc_norm\": 0.29720670391061454,\n\
\ \"acc_norm_stderr\": 0.015285313353641602\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.028452639985088006,\n\
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.028452639985088006\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6045016077170418,\n\
\ \"acc_stderr\": 0.027770918531427838,\n \"acc_norm\": 0.6045016077170418,\n\
\ \"acc_norm_stderr\": 0.027770918531427838\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5709876543209876,\n \"acc_stderr\": 0.027538925613470863,\n\
\ \"acc_norm\": 0.5709876543209876,\n \"acc_norm_stderr\": 0.027538925613470863\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3971631205673759,\n \"acc_stderr\": 0.0291898056735871,\n \
\ \"acc_norm\": 0.3971631205673759,\n \"acc_norm_stderr\": 0.0291898056735871\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3754889178617992,\n\
\ \"acc_stderr\": 0.012367945396728208,\n \"acc_norm\": 0.3754889178617992,\n\
\ \"acc_norm_stderr\": 0.012367945396728208\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.03035969707904611,\n\
\ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03035969707904611\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.49836601307189543,\n \"acc_stderr\": 0.020227726838150124,\n \
\ \"acc_norm\": 0.49836601307189543,\n \"acc_norm_stderr\": 0.020227726838150124\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.031251275910891656,\n\
\ \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.031251275910891656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6716417910447762,\n\
\ \"acc_stderr\": 0.033206858897443244,\n \"acc_norm\": 0.6716417910447762,\n\
\ \"acc_norm_stderr\": 0.033206858897443244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3795180722891566,\n\
\ \"acc_stderr\": 0.03777798822748018,\n \"acc_norm\": 0.3795180722891566,\n\
\ \"acc_norm_stderr\": 0.03777798822748018\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03565079670708311,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03565079670708311\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32802937576499386,\n\
\ \"mc1_stderr\": 0.01643563293281503,\n \"mc2\": 0.48413168566081527,\n\
\ \"mc2_stderr\": 0.015167638286466481\n }\n}\n```"
repo_url: https://huggingface.co/Kiddyz/testlm-1-1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|arc:challenge|25_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hellaswag|10_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-16T10:49:10.911062.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-16T10:49:10.911062.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-16T10:49:10.911062.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-16T10:49:10.911062.parquet'
- config_name: results
data_files:
- split: 2023_08_16T10_49_10.911062
path:
- results_2023-08-16T10:49:10.911062.parquet
- split: latest
path:
- results_2023-08-16T10:49:10.911062.parquet
---
# Dataset Card for Evaluation run of Kiddyz/testlm-1-1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Kiddyz/testlm-1-1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Kiddyz/testlm-1-1](https://huggingface.co/Kiddyz/testlm-1-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
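As an illustration of the split-naming convention, a minimal sketch (inferred from the split names listed in the YAML front matter, where `-` and `:` in the run timestamp are replaced by `_`) of mapping a run timestamp to its split name:

```python
def run_timestamp_to_split(ts: str) -> str:
    # Split names cannot contain '-' or ':' in the date part,
    # so the leaderboard replaces them with '_'.
    return ts.replace("-", "_").replace(":", "_")

# The run shown in this card:
print(run_timestamp_to_split("2023-08-16T10:49:10.911062"))
# → 2023_08_16T10_49_10.911062
```

This is only a helper for locating a specific run's split; for the most recent results you can simply use `split="latest"`.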
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kiddyz__testlm-1-1",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-16T10:49:10.911062](https://huggingface.co/datasets/open-llm-leaderboard/details_Kiddyz__testlm-1-1/blob/main/results_2023-08-16T10%3A49%3A10.911062.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5128834307003443,
"acc_stderr": 0.03501260490290392,
"acc_norm": 0.5166256154161327,
"acc_norm_stderr": 0.03500071412093006,
"mc1": 0.32802937576499386,
"mc1_stderr": 0.01643563293281503,
"mc2": 0.48413168566081527,
"mc2_stderr": 0.015167638286466481
},
"harness|arc:challenge|25": {
"acc": 0.5017064846416383,
"acc_stderr": 0.014611305705056992,
"acc_norm": 0.5349829351535836,
"acc_norm_stderr": 0.014575583922019669
},
"harness|hellaswag|10": {
"acc": 0.5705038836885082,
"acc_stderr": 0.004939925958728884,
"acc_norm": 0.758016331408086,
"acc_norm_stderr": 0.004274091605308121
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750573,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750573
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5131578947368421,
"acc_stderr": 0.04067533136309174,
"acc_norm": 0.5131578947368421,
"acc_norm_stderr": 0.04067533136309174
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5433962264150943,
"acc_stderr": 0.03065674869673943,
"acc_norm": 0.5433962264150943,
"acc_norm_stderr": 0.03065674869673943
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.041553199555931467,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.041553199555931467
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537314,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537314
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.0242785680243077,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.0242785680243077
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5903225806451613,
"acc_stderr": 0.027976054915347368,
"acc_norm": 0.5903225806451613,
"acc_norm_stderr": 0.027976054915347368
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35960591133004927,
"acc_stderr": 0.033764582465095665,
"acc_norm": 0.35960591133004927,
"acc_norm_stderr": 0.033764582465095665
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6262626262626263,
"acc_stderr": 0.03446897738659333,
"acc_norm": 0.6262626262626263,
"acc_norm_stderr": 0.03446897738659333
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7202072538860104,
"acc_stderr": 0.03239637046735704,
"acc_norm": 0.7202072538860104,
"acc_norm_stderr": 0.03239637046735704
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.49743589743589745,
"acc_stderr": 0.025350672979412202,
"acc_norm": 0.49743589743589745,
"acc_norm_stderr": 0.025350672979412202
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073838,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073838
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5210084033613446,
"acc_stderr": 0.03244980849990029,
"acc_norm": 0.5210084033613446,
"acc_norm_stderr": 0.03244980849990029
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7119266055045872,
"acc_stderr": 0.01941644589263603,
"acc_norm": 0.7119266055045872,
"acc_norm_stderr": 0.01941644589263603
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.03166009679399813,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.03166009679399813
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7088607594936709,
"acc_stderr": 0.02957160106575337,
"acc_norm": 0.7088607594936709,
"acc_norm_stderr": 0.02957160106575337
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5919282511210763,
"acc_stderr": 0.03298574607842822,
"acc_norm": 0.5919282511210763,
"acc_norm_stderr": 0.03298574607842822
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.04328577215262972,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.04328577215262972
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04766075165356461,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04766075165356461
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5705521472392638,
"acc_stderr": 0.03889066619112722,
"acc_norm": 0.5705521472392638,
"acc_norm_stderr": 0.03889066619112722
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340456,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340456
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7649572649572649,
"acc_stderr": 0.027778835904935434,
"acc_norm": 0.7649572649572649,
"acc_norm_stderr": 0.027778835904935434
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7088122605363985,
"acc_stderr": 0.0162460870697014,
"acc_norm": 0.7088122605363985,
"acc_norm_stderr": 0.0162460870697014
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5173410404624278,
"acc_stderr": 0.026902900458666647,
"acc_norm": 0.5173410404624278,
"acc_norm_stderr": 0.026902900458666647
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29720670391061454,
"acc_stderr": 0.015285313353641602,
"acc_norm": 0.29720670391061454,
"acc_norm_stderr": 0.015285313353641602
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.028452639985088006,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.028452639985088006
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6045016077170418,
"acc_stderr": 0.027770918531427838,
"acc_norm": 0.6045016077170418,
"acc_norm_stderr": 0.027770918531427838
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5709876543209876,
"acc_stderr": 0.027538925613470863,
"acc_norm": 0.5709876543209876,
"acc_norm_stderr": 0.027538925613470863
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3971631205673759,
"acc_stderr": 0.0291898056735871,
"acc_norm": 0.3971631205673759,
"acc_norm_stderr": 0.0291898056735871
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3754889178617992,
"acc_stderr": 0.012367945396728208,
"acc_norm": 0.3754889178617992,
"acc_norm_stderr": 0.012367945396728208
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.03035969707904611,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.03035969707904611
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.49836601307189543,
"acc_stderr": 0.020227726838150124,
"acc_norm": 0.49836601307189543,
"acc_norm_stderr": 0.020227726838150124
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6081632653061224,
"acc_stderr": 0.031251275910891656,
"acc_norm": 0.6081632653061224,
"acc_norm_stderr": 0.031251275910891656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6716417910447762,
"acc_stderr": 0.033206858897443244,
"acc_norm": 0.6716417910447762,
"acc_norm_stderr": 0.033206858897443244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3795180722891566,
"acc_stderr": 0.03777798822748018,
"acc_norm": 0.3795180722891566,
"acc_norm_stderr": 0.03777798822748018
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03565079670708311,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03565079670708311
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32802937576499386,
"mc1_stderr": 0.01643563293281503,
"mc2": 0.48413168566081527,
"mc2_stderr": 0.015167638286466481
}
}
```
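The aggregate figures in the `"all"` entry above summarize the per-task scores. As an illustration (not the leaderboard's exact aggregation code, which may weight or select tasks differently), a macro-average over a few of the per-task accuracies copied from the JSON above can be computed like this:

```python
# Sketch: macro-averaging per-task "acc" values from the results JSON above.
# The task names and numbers are copied verbatim from that JSON; averaging
# only these three tasks is for illustration — the reported "all" score
# aggregates across all evaluated tasks.
results = {
    "harness|arc:challenge|25": {"acc": 0.5017064846416383},
    "harness|hellaswag|10": {"acc": 0.5705038836885082},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.25},
}

# Simple unweighted mean of the accuracies.
macro_avg = sum(task["acc"] for task in results.values()) / len(results)
print(round(macro_avg, 4))
```

The same pattern extends to `acc_norm` or the TruthfulQA `mc1`/`mc2` fields by swapping the key.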
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-college_medicine-neg-prepend-fix | 2023-08-21T07:33:02.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 6685
num_examples: 5
- name: test
num_bytes: 601115
num_examples: 173
download_size: 16063
dataset_size: 607800
---
# Dataset Card for "mmlu-college_medicine-neg-prepend-fix"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_xzuyn__Alpacino-SuperCOT-13B | 2023-08-27T12:39:00.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of xzuyn/Alpacino-SuperCOT-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xzuyn/Alpacino-SuperCOT-13B](https://huggingface.co/xzuyn/Alpacino-SuperCOT-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xzuyn__Alpacino-SuperCOT-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-18T14:16:23.975101](https://huggingface.co/datasets/open-llm-leaderboard/details_xzuyn__Alpacino-SuperCOT-13B/blob/main/results_2023-07-18T14%3A16%3A23.975101.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4824009425755326,\n\
\ \"acc_stderr\": 0.034928265143777885,\n \"acc_norm\": 0.48642737338368913,\n\
\ \"acc_norm_stderr\": 0.03490901762587443,\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.016238065069059608,\n \"mc2\": 0.454194299460972,\n\
\ \"mc2_stderr\": 0.014407901135836145\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5452218430034129,\n \"acc_stderr\": 0.014551507060836355,\n\
\ \"acc_norm\": 0.5836177474402731,\n \"acc_norm_stderr\": 0.014405618279436178\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6177056363274248,\n\
\ \"acc_stderr\": 0.004849547819134479,\n \"acc_norm\": 0.8168691495717985,\n\
\ \"acc_norm_stderr\": 0.0038598330442309015\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04068942293855797,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04068942293855797\n },\n\
\ \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n \
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4981132075471698,\n \"acc_stderr\": 0.030772653642075664,\n\
\ \"acc_norm\": 0.4981132075471698,\n \"acc_norm_stderr\": 0.030772653642075664\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4513888888888889,\n\
\ \"acc_stderr\": 0.04161402398403279,\n \"acc_norm\": 0.4513888888888889,\n\
\ \"acc_norm_stderr\": 0.04161402398403279\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.45664739884393063,\n\
\ \"acc_stderr\": 0.03798106566014499,\n \"acc_norm\": 0.45664739884393063,\n\
\ \"acc_norm_stderr\": 0.03798106566014499\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4085106382978723,\n \"acc_stderr\": 0.03213418026701576,\n\
\ \"acc_norm\": 0.4085106382978723,\n \"acc_norm_stderr\": 0.03213418026701576\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n\
\ \"acc_stderr\": 0.03892431106518754,\n \"acc_norm\": 0.21929824561403508,\n\
\ \"acc_norm_stderr\": 0.03892431106518754\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04082482904638628,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04082482904638628\n },\n\
\ \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2751322751322751,\n\
\ \"acc_stderr\": 0.023000086859068646,\n \"acc_norm\": 0.2751322751322751,\n\
\ \"acc_norm_stderr\": 0.023000086859068646\n },\n \"harness|hendrycksTest-formal_logic|5\"\
: {\n \"acc\": 0.31746031746031744,\n \"acc_stderr\": 0.04163453031302859,\n\
\ \"acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.04163453031302859\n\
\ },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.5161290322580645,\n \"acc_stderr\": 0.028429203176724555,\n\
\ \"acc_norm\": 0.5161290322580645,\n \"acc_norm_stderr\": 0.028429203176724555\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.28078817733990147,\n \"acc_stderr\": 0.0316185633535861,\n \"\
acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.0316185633535861\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.0381549430868893,\n\
\ \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.0381549430868893\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5404040404040404,\n \"acc_stderr\": 0.035507024651313425,\n \"\
acc_norm\": 0.5404040404040404,\n \"acc_norm_stderr\": 0.035507024651313425\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6476683937823834,\n \"acc_stderr\": 0.03447478286414357,\n\
\ \"acc_norm\": 0.6476683937823834,\n \"acc_norm_stderr\": 0.03447478286414357\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.43333333333333335,\n \"acc_stderr\": 0.02512465352588513,\n\
\ \"acc_norm\": 0.43333333333333335,\n \"acc_norm_stderr\": 0.02512465352588513\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.46218487394957986,\n \"acc_stderr\": 0.032385469487589795,\n\
\ \"acc_norm\": 0.46218487394957986,\n \"acc_norm_stderr\": 0.032385469487589795\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.036030385453603826,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.036030385453603826\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6275229357798165,\n \"acc_stderr\": 0.020728368457638497,\n \"\
acc_norm\": 0.6275229357798165,\n \"acc_norm_stderr\": 0.020728368457638497\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2916666666666667,\n \"acc_stderr\": 0.03099866630456052,\n \"\
acc_norm\": 0.2916666666666667,\n \"acc_norm_stderr\": 0.03099866630456052\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6078431372549019,\n \"acc_stderr\": 0.03426712349247273,\n \"\
acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.03426712349247273\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.70042194092827,\n \"acc_stderr\": 0.029818024749753095,\n \
\ \"acc_norm\": 0.70042194092827,\n \"acc_norm_stderr\": 0.029818024749753095\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n\
\ \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n\
\ \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.0435644720266507,\n\
\ \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.0435644720266507\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6776859504132231,\n \"acc_stderr\": 0.042664163633521685,\n \"\
acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.042664163633521685\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04803752235190193,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04803752235190193\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5214723926380368,\n \"acc_stderr\": 0.03924746876751128,\n\
\ \"acc_norm\": 0.5214723926380368,\n \"acc_norm_stderr\": 0.03924746876751128\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7435897435897436,\n\
\ \"acc_stderr\": 0.028605953702004243,\n \"acc_norm\": 0.7435897435897436,\n\
\ \"acc_norm_stderr\": 0.028605953702004243\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6922094508301405,\n\
\ \"acc_stderr\": 0.016506045045155633,\n \"acc_norm\": 0.6922094508301405,\n\
\ \"acc_norm_stderr\": 0.016506045045155633\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5115606936416185,\n \"acc_stderr\": 0.026911898686377913,\n\
\ \"acc_norm\": 0.5115606936416185,\n \"acc_norm_stderr\": 0.026911898686377913\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.028607893699576073,\n\
\ \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.028607893699576073\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5337620578778135,\n\
\ \"acc_stderr\": 0.02833327710956279,\n \"acc_norm\": 0.5337620578778135,\n\
\ \"acc_norm_stderr\": 0.02833327710956279\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.027777777777777797,\n\
\ \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.027777777777777797\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.34397163120567376,\n \"acc_stderr\": 0.02833801742861133,\n \
\ \"acc_norm\": 0.34397163120567376,\n \"acc_norm_stderr\": 0.02833801742861133\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.37809647979139505,\n\
\ \"acc_stderr\": 0.012384878406798095,\n \"acc_norm\": 0.37809647979139505,\n\
\ \"acc_norm_stderr\": 0.012384878406798095\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.03035230339535196,\n\
\ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.03035230339535196\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.48856209150326796,\n \"acc_stderr\": 0.020222541515610863,\n \
\ \"acc_norm\": 0.48856209150326796,\n \"acc_norm_stderr\": 0.020222541515610863\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4816326530612245,\n \"acc_stderr\": 0.03198761546763126,\n\
\ \"acc_norm\": 0.4816326530612245,\n \"acc_norm_stderr\": 0.03198761546763126\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6716417910447762,\n\
\ \"acc_stderr\": 0.03320685889744324,\n \"acc_norm\": 0.6716417910447762,\n\
\ \"acc_norm_stderr\": 0.03320685889744324\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079021,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079021\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.0352821125824523,\n\
\ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.0352821125824523\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.016238065069059608,\n \"mc2\": 0.454194299460972,\n\
\ \"mc2_stderr\": 0.014407901135836145\n }\n}\n```"
repo_url: https://huggingface.co/xzuyn/Alpacino-SuperCOT-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|arc:challenge|25_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hellaswag|10_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T14:16:23.975101.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:16:23.975101.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T14:16:23.975101.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T14:16:23.975101.parquet'
- config_name: results
data_files:
- split: 2023_07_18T14_16_23.975101
path:
- results_2023-07-18T14:16:23.975101.parquet
- split: latest
path:
- results_2023-07-18T14:16:23.975101.parquet
---
# Dataset Card for Evaluation run of xzuyn/Alpacino-SuperCOT-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/xzuyn/Alpacino-SuperCOT-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [xzuyn/Alpacino-SuperCOT-13B](https://huggingface.co/xzuyn/Alpacino-SuperCOT-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xzuyn__Alpacino-SuperCOT-13B",
"harness_truthfulqa_mc_0",
    split="latest")
```
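The config names for the other tasks follow the same naming pattern. As a rough sketch of the convention (the `config_name` helper below is illustrative only; it is not part of the `datasets` library):

```python
# Config names are derived mechanically from the harness task names:
# separators ("-", ":") become "_" and the few-shot count is appended.
# Illustrative helper; not part of the `datasets` API.
def config_name(task: str, num_fewshot: int) -> str:
    sanitized = task.replace("-", "_").replace(":", "_")
    return f"harness_{sanitized}_{num_fewshot}"

print(config_name("hendrycksTest-abstract_algebra", 5))
# harness_hendrycksTest_abstract_algebra_5
print(config_name("truthfulqa:mc", 0))
# harness_truthfulqa_mc_0
```

Passing any such name as the second argument to `load_dataset`, together with `split="latest"`, loads the most recent run's details for that task.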
## Latest results
These are the [latest results from run 2023-07-18T14:16:23.975101](https://huggingface.co/datasets/open-llm-leaderboard/details_xzuyn__Alpacino-SuperCOT-13B/blob/main/results_2023-07-18T14%3A16%3A23.975101.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its timestamped splits, and in the "latest" split of each config):
```python
{
"all": {
"acc": 0.4824009425755326,
"acc_stderr": 0.034928265143777885,
"acc_norm": 0.48642737338368913,
"acc_norm_stderr": 0.03490901762587443,
"mc1": 0.31334149326805383,
"mc1_stderr": 0.016238065069059608,
"mc2": 0.454194299460972,
"mc2_stderr": 0.014407901135836145
},
"harness|arc:challenge|25": {
"acc": 0.5452218430034129,
"acc_stderr": 0.014551507060836355,
"acc_norm": 0.5836177474402731,
"acc_norm_stderr": 0.014405618279436178
},
"harness|hellaswag|10": {
"acc": 0.6177056363274248,
"acc_stderr": 0.004849547819134479,
"acc_norm": 0.8168691495717985,
"acc_norm_stderr": 0.0038598330442309015
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5,
"acc_stderr": 0.04068942293855797,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04068942293855797
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4981132075471698,
"acc_stderr": 0.030772653642075664,
"acc_norm": 0.4981132075471698,
"acc_norm_stderr": 0.030772653642075664
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4513888888888889,
"acc_stderr": 0.04161402398403279,
"acc_norm": 0.4513888888888889,
"acc_norm_stderr": 0.04161402398403279
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.45664739884393063,
"acc_stderr": 0.03798106566014499,
"acc_norm": 0.45664739884393063,
"acc_norm_stderr": 0.03798106566014499
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4085106382978723,
"acc_stderr": 0.03213418026701576,
"acc_norm": 0.4085106382978723,
"acc_norm_stderr": 0.03213418026701576
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.03892431106518754,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.03892431106518754
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4,
"acc_stderr": 0.04082482904638628,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04082482904638628
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2751322751322751,
"acc_stderr": 0.023000086859068646,
"acc_norm": 0.2751322751322751,
"acc_norm_stderr": 0.023000086859068646
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5161290322580645,
"acc_stderr": 0.028429203176724555,
"acc_norm": 0.5161290322580645,
"acc_norm_stderr": 0.028429203176724555
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.0316185633535861,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.0316185633535861
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.0381549430868893,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.0381549430868893
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5404040404040404,
"acc_stderr": 0.035507024651313425,
"acc_norm": 0.5404040404040404,
"acc_norm_stderr": 0.035507024651313425
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6476683937823834,
"acc_stderr": 0.03447478286414357,
"acc_norm": 0.6476683937823834,
"acc_norm_stderr": 0.03447478286414357
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.43333333333333335,
"acc_stderr": 0.02512465352588513,
"acc_norm": 0.43333333333333335,
"acc_norm_stderr": 0.02512465352588513
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.46218487394957986,
"acc_stderr": 0.032385469487589795,
"acc_norm": 0.46218487394957986,
"acc_norm_stderr": 0.032385469487589795
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.036030385453603826,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.036030385453603826
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6275229357798165,
"acc_stderr": 0.020728368457638497,
"acc_norm": 0.6275229357798165,
"acc_norm_stderr": 0.020728368457638497
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.03099866630456052,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.03099866630456052
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.03426712349247273,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.03426712349247273
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.70042194092827,
"acc_stderr": 0.029818024749753095,
"acc_norm": 0.70042194092827,
"acc_norm_stderr": 0.029818024749753095
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5572519083969466,
"acc_stderr": 0.0435644720266507,
"acc_norm": 0.5572519083969466,
"acc_norm_stderr": 0.0435644720266507
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.042664163633521685,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.042664163633521685
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04803752235190193,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04803752235190193
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5214723926380368,
"acc_stderr": 0.03924746876751128,
"acc_norm": 0.5214723926380368,
"acc_norm_stderr": 0.03924746876751128
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7435897435897436,
"acc_stderr": 0.028605953702004243,
"acc_norm": 0.7435897435897436,
"acc_norm_stderr": 0.028605953702004243
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6922094508301405,
"acc_stderr": 0.016506045045155633,
"acc_norm": 0.6922094508301405,
"acc_norm_stderr": 0.016506045045155633
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5115606936416185,
"acc_stderr": 0.026911898686377913,
"acc_norm": 0.5115606936416185,
"acc_norm_stderr": 0.026911898686377913
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.028607893699576073,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.028607893699576073
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5337620578778135,
"acc_stderr": 0.02833327710956279,
"acc_norm": 0.5337620578778135,
"acc_norm_stderr": 0.02833327710956279
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.027777777777777797,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.027777777777777797
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.34397163120567376,
"acc_stderr": 0.02833801742861133,
"acc_norm": 0.34397163120567376,
"acc_norm_stderr": 0.02833801742861133
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.37809647979139505,
"acc_stderr": 0.012384878406798095,
"acc_norm": 0.37809647979139505,
"acc_norm_stderr": 0.012384878406798095
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.03035230339535196,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.03035230339535196
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.48856209150326796,
"acc_stderr": 0.020222541515610863,
"acc_norm": 0.48856209150326796,
"acc_norm_stderr": 0.020222541515610863
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4816326530612245,
"acc_stderr": 0.03198761546763126,
"acc_norm": 0.4816326530612245,
"acc_norm_stderr": 0.03198761546763126
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6716417910447762,
"acc_stderr": 0.03320685889744324,
"acc_norm": 0.6716417910447762,
"acc_norm_stderr": 0.03320685889744324
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079021,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079021
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.0352821125824523,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.0352821125824523
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31334149326805383,
"mc1_stderr": 0.016238065069059608,
"mc2": 0.454194299460972,
"mc2_stderr": 0.014407901135836145
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_xzuyn__MedicWizard-7B | 2023-08-27T12:39:01.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of xzuyn/MedicWizard-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xzuyn/MedicWizard-7B](https://huggingface.co/xzuyn/MedicWizard-7B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xzuyn__MedicWizard-7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-18T11:48:49.794718](https://huggingface.co/datasets/open-llm-leaderboard/details_xzuyn__MedicWizard-7B/blob/main/results_2023-07-18T11%3A48%3A49.794718.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4498163922254517,\n\
\ \"acc_stderr\": 0.03531475159142684,\n \"acc_norm\": 0.45333413347549184,\n\
\ \"acc_norm_stderr\": 0.03530105541819273,\n \"mc1\": 0.27050183598531213,\n\
\ \"mc1_stderr\": 0.015550778332842897,\n \"mc2\": 0.4131669444750809,\n\
\ \"mc2_stderr\": 0.015388789815395057\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.507679180887372,\n \"acc_stderr\": 0.01460966744089257,\n\
\ \"acc_norm\": 0.5349829351535836,\n \"acc_norm_stderr\": 0.014575583922019669\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6036646086436964,\n\
\ \"acc_stderr\": 0.004881359589148999,\n \"acc_norm\": 0.7839075881298546,\n\
\ \"acc_norm_stderr\": 0.004107368887208787\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5660377358490566,\n \"acc_stderr\": 0.030503292013342596,\n\
\ \"acc_norm\": 0.5660377358490566,\n \"acc_norm_stderr\": 0.030503292013342596\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4861111111111111,\n\
\ \"acc_stderr\": 0.041795966175810016,\n \"acc_norm\": 0.4861111111111111,\n\
\ \"acc_norm_stderr\": 0.041795966175810016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4797687861271676,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.4797687861271676,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.03177821250236922,\n\
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.03177821250236922\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.36551724137931035,\n \"acc_stderr\": 0.04013124195424385,\n\
\ \"acc_norm\": 0.36551724137931035,\n \"acc_norm_stderr\": 0.04013124195424385\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24867724867724866,\n \"acc_stderr\": 0.022261817692400175,\n \"\
acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.022261817692400175\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.0416345303130286,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.0416345303130286\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.45806451612903226,\n \"acc_stderr\": 0.028343787250540625,\n \"\
acc_norm\": 0.45806451612903226,\n \"acc_norm_stderr\": 0.028343787250540625\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3448275862068966,\n \"acc_stderr\": 0.03344283744280458,\n \"\
acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03344283744280458\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5393939393939394,\n \"acc_stderr\": 0.03892207016552012,\n\
\ \"acc_norm\": 0.5393939393939394,\n \"acc_norm_stderr\": 0.03892207016552012\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5050505050505051,\n \"acc_stderr\": 0.035621707606254015,\n \"\
acc_norm\": 0.5050505050505051,\n \"acc_norm_stderr\": 0.035621707606254015\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6113989637305699,\n \"acc_stderr\": 0.03517739796373131,\n\
\ \"acc_norm\": 0.6113989637305699,\n \"acc_norm_stderr\": 0.03517739796373131\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.43333333333333335,\n \"acc_stderr\": 0.025124653525885127,\n\
\ \"acc_norm\": 0.43333333333333335,\n \"acc_norm_stderr\": 0.025124653525885127\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3865546218487395,\n \"acc_stderr\": 0.03163145807552379,\n \
\ \"acc_norm\": 0.3865546218487395,\n \"acc_norm_stderr\": 0.03163145807552379\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6385321100917432,\n \"acc_stderr\": 0.02059808200993737,\n \"\
acc_norm\": 0.6385321100917432,\n \"acc_norm_stderr\": 0.02059808200993737\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.029886910547626957,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.029886910547626957\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5931372549019608,\n \"acc_stderr\": 0.03447891136353382,\n \"\
acc_norm\": 0.5931372549019608,\n \"acc_norm_stderr\": 0.03447891136353382\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6033755274261603,\n \"acc_stderr\": 0.03184399873811224,\n \
\ \"acc_norm\": 0.6033755274261603,\n \"acc_norm_stderr\": 0.03184399873811224\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5650224215246636,\n\
\ \"acc_stderr\": 0.033272833702713445,\n \"acc_norm\": 0.5650224215246636,\n\
\ \"acc_norm_stderr\": 0.033272833702713445\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.48854961832061067,\n \"acc_stderr\": 0.043841400240780176,\n\
\ \"acc_norm\": 0.48854961832061067,\n \"acc_norm_stderr\": 0.043841400240780176\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068383,\n \"\
acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068383\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5462962962962963,\n\
\ \"acc_stderr\": 0.04812917324536824,\n \"acc_norm\": 0.5462962962962963,\n\
\ \"acc_norm_stderr\": 0.04812917324536824\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.49079754601226994,\n \"acc_stderr\": 0.03927705600787443,\n\
\ \"acc_norm\": 0.49079754601226994,\n \"acc_norm_stderr\": 0.03927705600787443\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4854368932038835,\n \"acc_stderr\": 0.04948637324026637,\n\
\ \"acc_norm\": 0.4854368932038835,\n \"acc_norm_stderr\": 0.04948637324026637\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6282051282051282,\n\
\ \"acc_stderr\": 0.03166098891888078,\n \"acc_norm\": 0.6282051282051282,\n\
\ \"acc_norm_stderr\": 0.03166098891888078\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562429,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562429\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5913154533844189,\n\
\ \"acc_stderr\": 0.017579250148153387,\n \"acc_norm\": 0.5913154533844189,\n\
\ \"acc_norm_stderr\": 0.017579250148153387\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4913294797687861,\n \"acc_stderr\": 0.026915047355369804,\n\
\ \"acc_norm\": 0.4913294797687861,\n \"acc_norm_stderr\": 0.026915047355369804\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.014355911964767864,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.014355911964767864\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.028629916715693413,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.028629916715693413\n \
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.49517684887459806,\n\
\ \"acc_stderr\": 0.028396770444111298,\n \"acc_norm\": 0.49517684887459806,\n\
\ \"acc_norm_stderr\": 0.028396770444111298\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.02782074420373286,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.02782074420373286\n },\n\
\ \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.028121636040639886,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.028121636040639886\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.3428943937418514,\n \"acc_stderr\": 0.012123463271585895,\n\
\ \"acc_norm\": 0.3428943937418514,\n \"acc_norm_stderr\": 0.012123463271585895\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.5735294117647058,\n \"acc_stderr\": 0.030042615832714867,\n \"\
acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.030042615832714867\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.47058823529411764,\n \"acc_stderr\": 0.02019280827143379,\n \
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.02019280827143379\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4727272727272727,\n\
\ \"acc_stderr\": 0.04782001791380063,\n \"acc_norm\": 0.4727272727272727,\n\
\ \"acc_norm_stderr\": 0.04782001791380063\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03136250240935892,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03136250240935892\n },\n\
\ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5223880597014925,\n\
\ \"acc_stderr\": 0.03531987930208731,\n \"acc_norm\": 0.5223880597014925,\n\
\ \"acc_norm_stderr\": 0.03531987930208731\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5497076023391813,\n \"acc_stderr\": 0.038158273659132366,\n\
\ \"acc_norm\": 0.5497076023391813,\n \"acc_norm_stderr\": 0.038158273659132366\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27050183598531213,\n\
\ \"mc1_stderr\": 0.015550778332842897,\n \"mc2\": 0.4131669444750809,\n\
\ \"mc2_stderr\": 0.015388789815395057\n }\n}\n```"
repo_url: https://huggingface.co/xzuyn/MedicWizard-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|arc:challenge|25_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hellaswag|10_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:48:49.794718.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:48:49.794718.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T11:48:49.794718.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T11:48:49.794718.parquet'
- config_name: results
data_files:
- split: 2023_07_18T11_48_49.794718
path:
- results_2023-07-18T11:48:49.794718.parquet
- split: latest
path:
- results_2023-07-18T11:48:49.794718.parquet
---
# Dataset Card for Evaluation run of xzuyn/MedicWizard-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/xzuyn/MedicWizard-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [xzuyn/MedicWizard-7B](https://huggingface.co/xzuyn/MedicWizard-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xzuyn__MedicWizard-7B",
"harness_truthfulqa_mc_0",
split="train")
```
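The config names above follow a regular pattern, so the per-subject MMLU details can be addressed programmatically. The helper below is a hypothetical sketch (the function name `details_config_name` is ours, not part of the `datasets` API); it assumes the `harness_hendrycksTest_<subject>_<num_fewshot>` naming visible in the config list above.

```python
def details_config_name(subject: str, num_fewshot: int = 5) -> str:
    """Build the config name for one MMLU subject, following the
    harness_hendrycksTest_<subject>_<num_fewshot> pattern used by the
    configs listed above (hypothetical helper, not a datasets API)."""
    return f"harness_hendrycksTest_{subject}_{num_fewshot}"

config = details_config_name("abstract_algebra")
# config == "harness_hendrycksTest_abstract_algebra_5"

# With network access, the per-sample details for that subject could then
# be fetched via:
#   from datasets import load_dataset
#   data = load_dataset("open-llm-leaderboard/details_xzuyn__MedicWizard-7B",
#                       config, split="latest")
```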
## Latest results
These are the [latest results from run 2023-07-18T11:48:49.794718](https://huggingface.co/datasets/open-llm-leaderboard/details_xzuyn__MedicWizard-7B/blob/main/results_2023-07-18T11%3A48%3A49.794718.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4498163922254517,
"acc_stderr": 0.03531475159142684,
"acc_norm": 0.45333413347549184,
"acc_norm_stderr": 0.03530105541819273,
"mc1": 0.27050183598531213,
"mc1_stderr": 0.015550778332842897,
"mc2": 0.4131669444750809,
"mc2_stderr": 0.015388789815395057
},
"harness|arc:challenge|25": {
"acc": 0.507679180887372,
"acc_stderr": 0.01460966744089257,
"acc_norm": 0.5349829351535836,
"acc_norm_stderr": 0.014575583922019669
},
"harness|hellaswag|10": {
"acc": 0.6036646086436964,
"acc_stderr": 0.004881359589148999,
"acc_norm": 0.7839075881298546,
"acc_norm_stderr": 0.004107368887208787
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5660377358490566,
"acc_stderr": 0.030503292013342596,
"acc_norm": 0.5660377358490566,
"acc_norm_stderr": 0.030503292013342596
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.041795966175810016,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.041795966175810016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4797687861271676,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.4797687861271676,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.36551724137931035,
"acc_stderr": 0.04013124195424385,
"acc_norm": 0.36551724137931035,
"acc_norm_stderr": 0.04013124195424385
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.022261817692400175,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.022261817692400175
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.0416345303130286,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.0416345303130286
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.45806451612903226,
"acc_stderr": 0.028343787250540625,
"acc_norm": 0.45806451612903226,
"acc_norm_stderr": 0.028343787250540625
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03344283744280458,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03344283744280458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5393939393939394,
"acc_stderr": 0.03892207016552012,
"acc_norm": 0.5393939393939394,
"acc_norm_stderr": 0.03892207016552012
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5050505050505051,
"acc_stderr": 0.035621707606254015,
"acc_norm": 0.5050505050505051,
"acc_norm_stderr": 0.035621707606254015
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6113989637305699,
"acc_stderr": 0.03517739796373131,
"acc_norm": 0.6113989637305699,
"acc_norm_stderr": 0.03517739796373131
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.43333333333333335,
"acc_stderr": 0.025124653525885127,
"acc_norm": 0.43333333333333335,
"acc_norm_stderr": 0.025124653525885127
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3865546218487395,
"acc_stderr": 0.03163145807552379,
"acc_norm": 0.3865546218487395,
"acc_norm_stderr": 0.03163145807552379
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6385321100917432,
"acc_stderr": 0.02059808200993737,
"acc_norm": 0.6385321100917432,
"acc_norm_stderr": 0.02059808200993737
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.029886910547626957,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.029886910547626957
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5931372549019608,
"acc_stderr": 0.03447891136353382,
"acc_norm": 0.5931372549019608,
"acc_norm_stderr": 0.03447891136353382
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6033755274261603,
"acc_stderr": 0.03184399873811224,
"acc_norm": 0.6033755274261603,
"acc_norm_stderr": 0.03184399873811224
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5650224215246636,
"acc_stderr": 0.033272833702713445,
"acc_norm": 0.5650224215246636,
"acc_norm_stderr": 0.033272833702713445
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.48854961832061067,
"acc_stderr": 0.043841400240780176,
"acc_norm": 0.48854961832061067,
"acc_norm_stderr": 0.043841400240780176
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.04449270350068383,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.04449270350068383
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.04812917324536824,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.04812917324536824
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49079754601226994,
"acc_stderr": 0.03927705600787443,
"acc_norm": 0.49079754601226994,
"acc_norm_stderr": 0.03927705600787443
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.4854368932038835,
"acc_stderr": 0.04948637324026637,
"acc_norm": 0.4854368932038835,
"acc_norm_stderr": 0.04948637324026637
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6282051282051282,
"acc_stderr": 0.03166098891888078,
"acc_norm": 0.6282051282051282,
"acc_norm_stderr": 0.03166098891888078
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562429,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562429
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5913154533844189,
"acc_stderr": 0.017579250148153387,
"acc_norm": 0.5913154533844189,
"acc_norm_stderr": 0.017579250148153387
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4913294797687861,
"acc_stderr": 0.026915047355369804,
"acc_norm": 0.4913294797687861,
"acc_norm_stderr": 0.026915047355369804
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767864,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767864
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5,
"acc_stderr": 0.028629916715693413,
"acc_norm": 0.5,
"acc_norm_stderr": 0.028629916715693413
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.49517684887459806,
"acc_stderr": 0.028396770444111298,
"acc_norm": 0.49517684887459806,
"acc_norm_stderr": 0.028396770444111298
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5,
"acc_stderr": 0.02782074420373286,
"acc_norm": 0.5,
"acc_norm_stderr": 0.02782074420373286
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028121636040639886,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028121636040639886
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3428943937418514,
"acc_stderr": 0.012123463271585895,
"acc_norm": 0.3428943937418514,
"acc_norm_stderr": 0.012123463271585895
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.030042615832714867,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.030042615832714867
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.02019280827143379,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.02019280827143379
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4727272727272727,
"acc_stderr": 0.04782001791380063,
"acc_norm": 0.4727272727272727,
"acc_norm_stderr": 0.04782001791380063
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.03136250240935892,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03136250240935892
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5223880597014925,
"acc_stderr": 0.03531987930208731,
"acc_norm": 0.5223880597014925,
"acc_norm_stderr": 0.03531987930208731
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5497076023391813,
"acc_stderr": 0.038158273659132366,
"acc_norm": 0.5497076023391813,
"acc_norm_stderr": 0.038158273659132366
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27050183598531213,
"mc1_stderr": 0.015550778332842897,
"mc2": 0.4131669444750809,
"mc2_stderr": 0.015388789815395057
}
}
```
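As a rough illustration, the headline numbers above can be combined into a single score. This is a sketch under stated assumptions: it averages ARC acc_norm, HellaSwag acc_norm, the overall "all" acc (which here aggregates every task, so it is only a proxy for the MMLU average), and TruthfulQA mc2; the exact leaderboard aggregation formula may differ.

```python
# Headline metrics copied from the results JSON above.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.5349829351535836},
    "harness|hellaswag|10": {"acc_norm": 0.7839075881298546},
    "all": {"acc": 0.4498163922254517},  # mean acc over all tasks (proxy)
    "harness|truthfulqa:mc|0": {"mc2": 0.4131669444750809},
}

# Simple mean of the four benchmark scores (assumed aggregation scheme).
average = (
    results["harness|arc:challenge|25"]["acc_norm"]
    + results["harness|hellaswag|10"]["acc_norm"]
    + results["all"]["acc"]
    + results["harness|truthfulqa:mc|0"]["mc2"]
) / 4
print(f"average score ~ {average:.4f}")
```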
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-college_physics-neg-prepend-fix | 2023-08-21T07:33:13.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 7147
num_examples: 5
- name: test
num_bytes: 282713
num_examples: 102
download_size: 15541
dataset_size: 289860
---
# Dataset Card for "mmlu-college_physics-neg-prepend-fix"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_gywy__llama2-13b-chinese-v1 | 2023-08-27T12:39:03.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of gywy/llama2-13b-chinese-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [gywy/llama2-13b-chinese-v1](https://huggingface.co/gywy/llama2-13b-chinese-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gywy__llama2-13b-chinese-v1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-26T15:10:00.921624](https://huggingface.co/datasets/open-llm-leaderboard/details_gywy__llama2-13b-chinese-v1/blob/main/results_2023-07-26T15%3A10%3A00.921624.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5420814148370803,\n\
\ \"acc_stderr\": 0.03472894201222865,\n \"acc_norm\": 0.5463875639849175,\n\
\ \"acc_norm_stderr\": 0.03471430598894899,\n \"mc1\": 0.3182374541003672,\n\
\ \"mc1_stderr\": 0.016305988648920612,\n \"mc2\": 0.45724154700953135,\n\
\ \"mc2_stderr\": 0.015310459215672905\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5631399317406144,\n \"acc_stderr\": 0.014494421584256513,\n\
\ \"acc_norm\": 0.5981228668941979,\n \"acc_norm_stderr\": 0.014327268614578278\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5381398127862975,\n\
\ \"acc_stderr\": 0.004975243508751998,\n \"acc_norm\": 0.7572196773551085,\n\
\ \"acc_norm_stderr\": 0.004278871104930374\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.04033565667848319,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.04033565667848319\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5924528301886792,\n \"acc_stderr\": 0.030242233800854494,\n\
\ \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.030242233800854494\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.041553199555931467,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.041553199555931467\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n\
\ \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.5086705202312138,\n\
\ \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537314,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537314\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3201058201058201,\n \"acc_stderr\": 0.024026846392873506,\n \"\
acc_norm\": 0.3201058201058201,\n \"acc_norm_stderr\": 0.024026846392873506\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6612903225806451,\n\
\ \"acc_stderr\": 0.026923446059302837,\n \"acc_norm\": 0.6612903225806451,\n\
\ \"acc_norm_stderr\": 0.026923446059302837\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43349753694581283,\n \"acc_stderr\": 0.034867317274198714,\n\
\ \"acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.034867317274198714\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.036639749943912434,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.036639749943912434\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"\
acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7772020725388601,\n \"acc_stderr\": 0.03003114797764154,\n\
\ \"acc_norm\": 0.7772020725388601,\n \"acc_norm_stderr\": 0.03003114797764154\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4846153846153846,\n \"acc_stderr\": 0.025339003010106515,\n\
\ \"acc_norm\": 0.4846153846153846,\n \"acc_norm_stderr\": 0.025339003010106515\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.03181110032413926,\n \
\ \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.03181110032413926\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7009174311926606,\n \"acc_stderr\": 0.019630417285415175,\n \"\
acc_norm\": 0.7009174311926606,\n \"acc_norm_stderr\": 0.019630417285415175\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4027777777777778,\n \"acc_stderr\": 0.03344887382997867,\n \"\
acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.03344887382997867\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7401960784313726,\n \"acc_stderr\": 0.03077855467869326,\n \"\
acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.03077855467869326\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460302,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460302\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.042258754519696365,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.042258754519696365\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6134969325153374,\n \"acc_stderr\": 0.03825825548848608,\n\
\ \"acc_norm\": 0.6134969325153374,\n \"acc_norm_stderr\": 0.03825825548848608\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729245,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729245\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7564102564102564,\n\
\ \"acc_stderr\": 0.02812096650391442,\n \"acc_norm\": 0.7564102564102564,\n\
\ \"acc_norm_stderr\": 0.02812096650391442\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7254150702426565,\n\
\ \"acc_stderr\": 0.015959829933084025,\n \"acc_norm\": 0.7254150702426565,\n\
\ \"acc_norm_stderr\": 0.015959829933084025\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6098265895953757,\n \"acc_stderr\": 0.026261677607806636,\n\
\ \"acc_norm\": 0.6098265895953757,\n \"acc_norm_stderr\": 0.026261677607806636\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37094972067039106,\n\
\ \"acc_stderr\": 0.016155910721341774,\n \"acc_norm\": 0.37094972067039106,\n\
\ \"acc_norm_stderr\": 0.016155910721341774\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5784313725490197,\n \"acc_stderr\": 0.028275490156791455,\n\
\ \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.028275490156791455\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6237942122186495,\n\
\ \"acc_stderr\": 0.02751392568354943,\n \"acc_norm\": 0.6237942122186495,\n\
\ \"acc_norm_stderr\": 0.02751392568354943\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.02751374728437942,\n\
\ \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.02751374728437942\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4432624113475177,\n \"acc_stderr\": 0.029634838473766,\n \
\ \"acc_norm\": 0.4432624113475177,\n \"acc_norm_stderr\": 0.029634838473766\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3970013037809648,\n\
\ \"acc_stderr\": 0.012496346982909553,\n \"acc_norm\": 0.3970013037809648,\n\
\ \"acc_norm_stderr\": 0.012496346982909553\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5375816993464052,\n \"acc_stderr\": 0.02017061497496976,\n \
\ \"acc_norm\": 0.5375816993464052,\n \"acc_norm_stderr\": 0.02017061497496976\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6122448979591837,\n \"acc_stderr\": 0.031192230726795656,\n\
\ \"acc_norm\": 0.6122448979591837,\n \"acc_norm_stderr\": 0.031192230726795656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.031524391865554016,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.031524391865554016\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3182374541003672,\n\
\ \"mc1_stderr\": 0.016305988648920612,\n \"mc2\": 0.45724154700953135,\n\
\ \"mc2_stderr\": 0.015310459215672905\n }\n}\n```"
repo_url: https://huggingface.co/gywy/llama2-13b-chinese-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|arc:challenge|25_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hellaswag|10_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-26T15:10:00.921624.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-26T15:10:00.921624.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-26T15:10:00.921624.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-26T15:10:00.921624.parquet'
- config_name: results
data_files:
- split: 2023_07_26T15_10_00.921624
path:
- results_2023-07-26T15:10:00.921624.parquet
- split: latest
path:
- results_2023-07-26T15:10:00.921624.parquet
---
# Dataset Card for Evaluation run of gywy/llama2-13b-chinese-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/gywy/llama2-13b-chinese-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [gywy/llama2-13b-chinese-v1](https://huggingface.co/gywy/llama2-13b-chinese-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gywy__llama2-13b-chinese-v1",
"harness_truthfulqa_mc_0",
split="train")
```
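Each configuration also exposes timestamped splits alongside `latest`. Because the split names follow a fixed `YYYY_MM_DDTHH_MM_SS.ffffff` pattern, they sort lexicographically in chronological order, so the newest run can be picked without parsing dates — a minimal sketch (the split names below are illustrative):

```python
# Timestamped split names as they appear in this dataset's configs.
# The fixed-width pattern means lexicographic order == chronological order,
# so max() selects the most recent run (what the "latest" split points to).
split_names = [
    "2023_07_19T16_40_02.088273",
    "2023_07_26T15_10_00.921624",
]
newest = max(split_names)
print(newest)  # 2023_07_26T15_10_00.921624
```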
## Latest results
These are the [latest results from run 2023-07-26T15:10:00.921624](https://huggingface.co/datasets/open-llm-leaderboard/details_gywy__llama2-13b-chinese-v1/blob/main/results_2023-07-26T15%3A10%3A00.921624.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5420814148370803,
"acc_stderr": 0.03472894201222865,
"acc_norm": 0.5463875639849175,
"acc_norm_stderr": 0.03471430598894899,
"mc1": 0.3182374541003672,
"mc1_stderr": 0.016305988648920612,
"mc2": 0.45724154700953135,
"mc2_stderr": 0.015310459215672905
},
"harness|arc:challenge|25": {
"acc": 0.5631399317406144,
"acc_stderr": 0.014494421584256513,
"acc_norm": 0.5981228668941979,
"acc_norm_stderr": 0.014327268614578278
},
"harness|hellaswag|10": {
"acc": 0.5381398127862975,
"acc_stderr": 0.004975243508751998,
"acc_norm": 0.7572196773551085,
"acc_norm_stderr": 0.004278871104930374
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.04033565667848319,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.04033565667848319
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5924528301886792,
"acc_stderr": 0.030242233800854494,
"acc_norm": 0.5924528301886792,
"acc_norm_stderr": 0.030242233800854494
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.041553199555931467,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.041553199555931467
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.03811890988940412,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.03811890988940412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006717,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006717
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537314,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537314
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3201058201058201,
"acc_stderr": 0.024026846392873506,
"acc_norm": 0.3201058201058201,
"acc_norm_stderr": 0.024026846392873506
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6612903225806451,
"acc_stderr": 0.026923446059302837,
"acc_norm": 0.6612903225806451,
"acc_norm_stderr": 0.026923446059302837
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43349753694581283,
"acc_stderr": 0.034867317274198714,
"acc_norm": 0.43349753694581283,
"acc_norm_stderr": 0.034867317274198714
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.036639749943912434,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.036639749943912434
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6868686868686869,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.6868686868686869,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7772020725388601,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.7772020725388601,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4846153846153846,
"acc_stderr": 0.025339003010106515,
"acc_norm": 0.4846153846153846,
"acc_norm_stderr": 0.025339003010106515
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6008403361344538,
"acc_stderr": 0.03181110032413926,
"acc_norm": 0.6008403361344538,
"acc_norm_stderr": 0.03181110032413926
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7009174311926606,
"acc_stderr": 0.019630417285415175,
"acc_norm": 0.7009174311926606,
"acc_norm_stderr": 0.019630417285415175
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.03344887382997867,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.03344887382997867
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.03077855467869326,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.03077855467869326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460302,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460302
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.042258754519696365,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.042258754519696365
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6134969325153374,
"acc_stderr": 0.03825825548848608,
"acc_norm": 0.6134969325153374,
"acc_norm_stderr": 0.03825825548848608
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729245,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729245
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7564102564102564,
"acc_stderr": 0.02812096650391442,
"acc_norm": 0.7564102564102564,
"acc_norm_stderr": 0.02812096650391442
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7254150702426565,
"acc_stderr": 0.015959829933084025,
"acc_norm": 0.7254150702426565,
"acc_norm_stderr": 0.015959829933084025
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6098265895953757,
"acc_stderr": 0.026261677607806636,
"acc_norm": 0.6098265895953757,
"acc_norm_stderr": 0.026261677607806636
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37094972067039106,
"acc_stderr": 0.016155910721341774,
"acc_norm": 0.37094972067039106,
"acc_norm_stderr": 0.016155910721341774
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.028275490156791455,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.028275490156791455
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6237942122186495,
"acc_stderr": 0.02751392568354943,
"acc_norm": 0.6237942122186495,
"acc_norm_stderr": 0.02751392568354943
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.02751374728437942,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.02751374728437942
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4432624113475177,
"acc_stderr": 0.029634838473766,
"acc_norm": 0.4432624113475177,
"acc_norm_stderr": 0.029634838473766
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3970013037809648,
"acc_stderr": 0.012496346982909553,
"acc_norm": 0.3970013037809648,
"acc_norm_stderr": 0.012496346982909553
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5073529411764706,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.5073529411764706,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5375816993464052,
"acc_stderr": 0.02017061497496976,
"acc_norm": 0.5375816993464052,
"acc_norm_stderr": 0.02017061497496976
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6122448979591837,
"acc_stderr": 0.031192230726795656,
"acc_norm": 0.6122448979591837,
"acc_norm_stderr": 0.031192230726795656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.031524391865554016,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.031524391865554016
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3182374541003672,
"mc1_stderr": 0.016305988648920612,
"mc2": 0.45724154700953135,
"mc2_stderr": 0.015310459215672905
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-computer_security-neg-prepend-fix | 2023-08-21T07:33:26.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 5063
num_examples: 5
- name: test
num_bytes: 229284
num_examples: 100
download_size: 13363
dataset_size: 234347
---
# Dataset Card for "mmlu-computer_security-neg-prepend-fix"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Monero__Manticore-13b-Chat-Pyg-Guanaco | 2023-09-17T08:05:14.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Monero/Manticore-13b-Chat-Pyg-Guanaco
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Monero/Manticore-13b-Chat-Pyg-Guanaco](https://huggingface.co/Monero/Manticore-13b-Chat-Pyg-Guanaco)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Monero__Manticore-13b-Chat-Pyg-Guanaco\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T08:05:02.846180](https://huggingface.co/datasets/open-llm-leaderboard/details_Monero__Manticore-13b-Chat-Pyg-Guanaco/blob/main/results_2023-09-17T08-05-02.846180.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.1636954697986577,\n\
\ \"em_stderr\": 0.00378913611358371,\n \"f1\": 0.25622378355704734,\n\
\ \"f1_stderr\": 0.003909791858313052,\n \"acc\": 0.412985669347219,\n\
\ \"acc_stderr\": 0.010037439004551042\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.1636954697986577,\n \"em_stderr\": 0.00378913611358371,\n\
\ \"f1\": 0.25622378355704734,\n \"f1_stderr\": 0.003909791858313052\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08642911296436695,\n \
\ \"acc_stderr\": 0.007740044337103798\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.739542225730071,\n \"acc_stderr\": 0.012334833671998289\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Monero/Manticore-13b-Chat-Pyg-Guanaco
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T08_05_02.846180
path:
- '**/details_harness|drop|3_2023-09-17T08-05-02.846180.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T08-05-02.846180.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T08_05_02.846180
path:
- '**/details_harness|gsm8k|5_2023-09-17T08-05-02.846180.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T08-05-02.846180.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:26:13.261313.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:26:13.261313.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:26:13.261313.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T08_05_02.846180
path:
- '**/details_harness|winogrande|5_2023-09-17T08-05-02.846180.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T08-05-02.846180.parquet'
- config_name: results
data_files:
- split: 2023_07_19T18_26_13.261313
path:
- results_2023-07-19T18:26:13.261313.parquet
- split: 2023_09_17T08_05_02.846180
path:
- results_2023-09-17T08-05-02.846180.parquet
- split: latest
path:
- results_2023-09-17T08-05-02.846180.parquet
---
# Dataset Card for Evaluation run of Monero/Manticore-13b-Chat-Pyg-Guanaco
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Monero/Manticore-13b-Chat-Pyg-Guanaco
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Monero/Manticore-13b-Chat-Pyg-Guanaco](https://huggingface.co/Monero/Manticore-13b-Chat-Pyg-Guanaco) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Monero__Manticore-13b-Chat-Pyg-Guanaco",
"harness_winogrande_5",
	split="latest")
```
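As a stdlib-only aside (no Hub access needed), the timestamped split names listed in the configs above can be sorted to work out which run is most recent. This is a minimal sketch, assuming the leaderboard's `%Y_%m_%dT%H_%M_%S.%f` split-naming convention holds:

```python
from datetime import datetime

def latest_run_split(split_names):
    """Return the most recent timestamped split name.

    Split names look like '2023_09_17T08_05_02.846180'; the literal
    'latest' alias is skipped so the timestamp order resolves it.
    """
    stamped = [s for s in split_names if s != "latest"]
    # Parse the leaderboard's timestamp format into datetimes for comparison.
    return max(stamped, key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"))

splits = ["2023_07_19T18_26_13.261313", "2023_09_17T08_05_02.846180", "latest"]
print(latest_run_split(splits))  # → 2023_09_17T08_05_02.846180
```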
## Latest results
These are the [latest results from run 2023-09-17T08:05:02.846180](https://huggingface.co/datasets/open-llm-leaderboard/details_Monero__Manticore-13b-Chat-Pyg-Guanaco/blob/main/results_2023-09-17T08-05-02.846180.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its "latest" split):
```python
{
"all": {
"em": 0.1636954697986577,
"em_stderr": 0.00378913611358371,
"f1": 0.25622378355704734,
"f1_stderr": 0.003909791858313052,
"acc": 0.412985669347219,
"acc_stderr": 0.010037439004551042
},
"harness|drop|3": {
"em": 0.1636954697986577,
"em_stderr": 0.00378913611358371,
"f1": 0.25622378355704734,
"f1_stderr": 0.003909791858313052
},
"harness|gsm8k|5": {
"acc": 0.08642911296436695,
"acc_stderr": 0.007740044337103798
},
"harness|winogrande|5": {
"acc": 0.739542225730071,
"acc_stderr": 0.012334833671998289
}
}
```
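For quick comparisons, the nested results dict above can be flattened to one metric per task. The sketch below uses values copied from the JSON and skips the `"all"` aggregate; nothing here queries the Hub:

```python
# Values copied from the results JSON above; "all" holds the aggregates.
results = {
    "all": {"em": 0.1636954697986577, "f1": 0.25622378355704734,
            "acc": 0.412985669347219},
    "harness|drop|3": {"em": 0.1636954697986577, "f1": 0.25622378355704734},
    "harness|gsm8k|5": {"acc": 0.08642911296436695},
    "harness|winogrande|5": {"acc": 0.739542225730071},
}

def per_task_metric(results, metric="acc"):
    """Map task name -> metric value, skipping the "all" aggregate."""
    return {task: vals[metric]
            for task, vals in results.items()
            if task != "all" and metric in vals}

print(per_task_metric(results, "acc"))
# → {'harness|gsm8k|5': 0.08642911296436695, 'harness|winogrande|5': 0.739542225730071}
```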
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-conceptual_physics-neg-prepend-fix | 2023-08-21T07:33:39.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 4993
num_examples: 5
- name: test
num_bytes: 438778
num_examples: 235
download_size: 13083
dataset_size: 443771
---
# Dataset Card for "mmlu-conceptual_physics-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Monero__WizardLM-Uncensored-SuperCOT-StoryTelling-30b | 2023-08-27T12:39:07.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b](https://huggingface.co/Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Monero__WizardLM-Uncensored-SuperCOT-StoryTelling-30b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-19T22:17:39.123351](https://huggingface.co/datasets/open-llm-leaderboard/details_Monero__WizardLM-Uncensored-SuperCOT-StoryTelling-30b/blob/main/results_2023-07-19T22%3A17%3A39.123351.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5457220476047545,\n\
\ \"acc_stderr\": 0.03459283803166259,\n \"acc_norm\": 0.5494247584970194,\n\
\ \"acc_norm_stderr\": 0.034575946686524923,\n \"mc1\": 0.3843329253365973,\n\
\ \"mc1_stderr\": 0.017028707301245196,\n \"mc2\": 0.5594851543429306,\n\
\ \"mc2_stderr\": 0.016227878204646183\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5742320819112628,\n \"acc_stderr\": 0.01444946427886881,\n\
\ \"acc_norm\": 0.5964163822525598,\n \"acc_norm_stderr\": 0.014337158914268445\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6027683728340968,\n\
\ \"acc_stderr\": 0.004883246579496668,\n \"acc_norm\": 0.799044015136427,\n\
\ \"acc_norm_stderr\": 0.003998962580974816\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874142,\n\
\ \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874142\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5584905660377358,\n \"acc_stderr\": 0.030561590426731833,\n\
\ \"acc_norm\": 0.5584905660377358,\n \"acc_norm_stderr\": 0.030561590426731833\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n\
\ \"acc_stderr\": 0.041614023984032786,\n \"acc_norm\": 0.5486111111111112,\n\
\ \"acc_norm_stderr\": 0.041614023984032786\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4797687861271676,\n\
\ \"acc_stderr\": 0.03809342081273958,\n \"acc_norm\": 0.4797687861271676,\n\
\ \"acc_norm_stderr\": 0.03809342081273958\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179327,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179327\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.049020713000019756,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.049020713000019756\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3492063492063492,\n \"acc_stderr\": 0.02455229220934265,\n \"\
acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.02455229220934265\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6161290322580645,\n\
\ \"acc_stderr\": 0.02766618207553964,\n \"acc_norm\": 0.6161290322580645,\n\
\ \"acc_norm_stderr\": 0.02766618207553964\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.39901477832512317,\n \"acc_stderr\": 0.03445487686264716,\n\
\ \"acc_norm\": 0.39901477832512317,\n \"acc_norm_stderr\": 0.03445487686264716\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.036974422050315946,\n\
\ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.036974422050315946\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533084,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533084\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7772020725388601,\n \"acc_stderr\": 0.030031147977641538,\n\
\ \"acc_norm\": 0.7772020725388601,\n \"acc_norm_stderr\": 0.030031147977641538\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5,\n \"acc_stderr\": 0.02535100632816969,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.02535100632816969\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n\
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6092436974789915,\n \"acc_stderr\": 0.031693802357129965,\n\
\ \"acc_norm\": 0.6092436974789915,\n \"acc_norm_stderr\": 0.031693802357129965\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7321100917431193,\n\
\ \"acc_stderr\": 0.018987462257978652,\n \"acc_norm\": 0.7321100917431193,\n\
\ \"acc_norm_stderr\": 0.018987462257978652\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.41203703703703703,\n \"acc_stderr\": 0.03356787758160835,\n\
\ \"acc_norm\": 0.41203703703703703,\n \"acc_norm_stderr\": 0.03356787758160835\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145638,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145638\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.043564472026650695,\n\
\ \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.043564472026650695\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.041391127276354626,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.041391127276354626\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n\
\ \"acc_stderr\": 0.045245960070300476,\n \"acc_norm\": 0.6759259259259259,\n\
\ \"acc_norm_stderr\": 0.045245960070300476\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6257668711656442,\n \"acc_stderr\": 0.03802068102899616,\n\
\ \"acc_norm\": 0.6257668711656442,\n \"acc_norm_stderr\": 0.03802068102899616\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.02466249684520981,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.02466249684520981\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7509578544061303,\n\
\ \"acc_stderr\": 0.015464676163395958,\n \"acc_norm\": 0.7509578544061303,\n\
\ \"acc_norm_stderr\": 0.015464676163395958\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5809248554913294,\n \"acc_stderr\": 0.026564178111422632,\n\
\ \"acc_norm\": 0.5809248554913294,\n \"acc_norm_stderr\": 0.026564178111422632\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34301675977653634,\n\
\ \"acc_stderr\": 0.015876912673057752,\n \"acc_norm\": 0.34301675977653634,\n\
\ \"acc_norm_stderr\": 0.015876912673057752\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5261437908496732,\n \"acc_stderr\": 0.028590752958852394,\n\
\ \"acc_norm\": 0.5261437908496732,\n \"acc_norm_stderr\": 0.028590752958852394\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n\
\ \"acc_stderr\": 0.02731684767419271,\n \"acc_norm\": 0.6366559485530546,\n\
\ \"acc_norm_stderr\": 0.02731684767419271\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6080246913580247,\n \"acc_stderr\": 0.027163686038271146,\n\
\ \"acc_norm\": 0.6080246913580247,\n \"acc_norm_stderr\": 0.027163686038271146\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40070921985815605,\n \"acc_stderr\": 0.029233465745573086,\n \
\ \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.029233465745573086\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4106910039113429,\n\
\ \"acc_stderr\": 0.012564871542534349,\n \"acc_norm\": 0.4106910039113429,\n\
\ \"acc_norm_stderr\": 0.012564871542534349\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5404411764705882,\n \"acc_stderr\": 0.030273325077345755,\n\
\ \"acc_norm\": 0.5404411764705882,\n \"acc_norm_stderr\": 0.030273325077345755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.565359477124183,\n \"acc_stderr\": 0.020054269200726463,\n \
\ \"acc_norm\": 0.565359477124183,\n \"acc_norm_stderr\": 0.020054269200726463\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5836734693877551,\n \"acc_stderr\": 0.031557828165561644,\n\
\ \"acc_norm\": 0.5836734693877551,\n \"acc_norm_stderr\": 0.031557828165561644\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6766169154228856,\n\
\ \"acc_stderr\": 0.03307615947979033,\n \"acc_norm\": 0.6766169154228856,\n\
\ \"acc_norm_stderr\": 0.03307615947979033\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036623,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036623\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.03889951252827217,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.03889951252827217\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03377310252209205,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03377310252209205\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3843329253365973,\n\
\ \"mc1_stderr\": 0.017028707301245196,\n \"mc2\": 0.5594851543429306,\n\
\ \"mc2_stderr\": 0.016227878204646183\n }\n}\n```"
repo_url: https://huggingface.co/Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|arc:challenge|25_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hellaswag|10_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:17:39.123351.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:17:39.123351.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T22:17:39.123351.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T22:17:39.123351.parquet'
- config_name: results
data_files:
- split: 2023_07_19T22_17_39.123351
path:
- results_2023-07-19T22:17:39.123351.parquet
- split: latest
path:
- results_2023-07-19T22:17:39.123351.parquet
---
# Dataset Card for Evaluation run of Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b](https://huggingface.co/Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Monero__WizardLM-Uncensored-SuperCOT-StoryTelling-30b",
"harness_truthfulqa_mc_0",
split="train")
```
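Each run is stored under a split named after its timestamp (plus a "latest" alias). Judging from the split names declared in this card's front matter, the split name appears to be the run timestamp with `-` and `:` replaced by `_` (the fractional-seconds dot is kept). A minimal sketch of that assumed mapping:

```python
def run_timestamp_to_split(ts: str) -> str:
    # Assumption based on the split names listed in this card:
    # "-" and ":" in the timestamp become "_"; the "." before the
    # microseconds is kept.
    return ts.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2023-07-19T22:17:39.123351"))
# 2023_07_19T22_17_39.123351
```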
## Latest results
These are the [latest results from run 2023-07-19T22:17:39.123351](https://huggingface.co/datasets/open-llm-leaderboard/details_Monero__WizardLM-Uncensored-SuperCOT-StoryTelling-30b/blob/main/results_2023-07-19T22%3A17%3A39.123351.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5457220476047545,
"acc_stderr": 0.03459283803166259,
"acc_norm": 0.5494247584970194,
"acc_norm_stderr": 0.034575946686524923,
"mc1": 0.3843329253365973,
"mc1_stderr": 0.017028707301245196,
"mc2": 0.5594851543429306,
"mc2_stderr": 0.016227878204646183
},
"harness|arc:challenge|25": {
"acc": 0.5742320819112628,
"acc_stderr": 0.01444946427886881,
"acc_norm": 0.5964163822525598,
"acc_norm_stderr": 0.014337158914268445
},
"harness|hellaswag|10": {
"acc": 0.6027683728340968,
"acc_stderr": 0.004883246579496668,
"acc_norm": 0.799044015136427,
"acc_norm_stderr": 0.003998962580974816
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874142,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874142
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5584905660377358,
"acc_stderr": 0.030561590426731833,
"acc_norm": 0.5584905660377358,
"acc_norm_stderr": 0.030561590426731833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5486111111111112,
"acc_stderr": 0.041614023984032786,
"acc_norm": 0.5486111111111112,
"acc_norm_stderr": 0.041614023984032786
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4797687861271676,
"acc_stderr": 0.03809342081273958,
"acc_norm": 0.4797687861271676,
"acc_norm_stderr": 0.03809342081273958
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179327,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179327
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.049020713000019756,
"acc_norm": 0.61,
"acc_norm_stderr": 0.049020713000019756
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4723404255319149,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.4723404255319149,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.02455229220934265,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.02455229220934265
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6161290322580645,
"acc_stderr": 0.02766618207553964,
"acc_norm": 0.6161290322580645,
"acc_norm_stderr": 0.02766618207553964
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39901477832512317,
"acc_stderr": 0.03445487686264716,
"acc_norm": 0.39901477832512317,
"acc_norm_stderr": 0.03445487686264716
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.036974422050315946,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.036974422050315946
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533084,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533084
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7772020725388601,
"acc_stderr": 0.030031147977641538,
"acc_norm": 0.7772020725388601,
"acc_norm_stderr": 0.030031147977641538
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5,
"acc_stderr": 0.02535100632816969,
"acc_norm": 0.5,
"acc_norm_stderr": 0.02535100632816969
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895991,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895991
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6092436974789915,
"acc_stderr": 0.031693802357129965,
"acc_norm": 0.6092436974789915,
"acc_norm_stderr": 0.031693802357129965
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7321100917431193,
"acc_stderr": 0.018987462257978652,
"acc_norm": 0.7321100917431193,
"acc_norm_stderr": 0.018987462257978652
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.41203703703703703,
"acc_stderr": 0.03356787758160835,
"acc_norm": 0.41203703703703703,
"acc_norm_stderr": 0.03356787758160835
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145638,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145638
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5572519083969466,
"acc_stderr": 0.043564472026650695,
"acc_norm": 0.5572519083969466,
"acc_norm_stderr": 0.043564472026650695
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.041391127276354626,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.041391127276354626
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.045245960070300476,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.045245960070300476
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6257668711656442,
"acc_stderr": 0.03802068102899616,
"acc_norm": 0.6257668711656442,
"acc_norm_stderr": 0.03802068102899616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.02466249684520981,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.02466249684520981
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7509578544061303,
"acc_stderr": 0.015464676163395958,
"acc_norm": 0.7509578544061303,
"acc_norm_stderr": 0.015464676163395958
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5809248554913294,
"acc_stderr": 0.026564178111422632,
"acc_norm": 0.5809248554913294,
"acc_norm_stderr": 0.026564178111422632
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34301675977653634,
"acc_stderr": 0.015876912673057752,
"acc_norm": 0.34301675977653634,
"acc_norm_stderr": 0.015876912673057752
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5261437908496732,
"acc_stderr": 0.028590752958852394,
"acc_norm": 0.5261437908496732,
"acc_norm_stderr": 0.028590752958852394
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6366559485530546,
"acc_stderr": 0.02731684767419271,
"acc_norm": 0.6366559485530546,
"acc_norm_stderr": 0.02731684767419271
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6080246913580247,
"acc_stderr": 0.027163686038271146,
"acc_norm": 0.6080246913580247,
"acc_norm_stderr": 0.027163686038271146
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.029233465745573086,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.029233465745573086
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4106910039113429,
"acc_stderr": 0.012564871542534349,
"acc_norm": 0.4106910039113429,
"acc_norm_stderr": 0.012564871542534349
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5404411764705882,
"acc_stderr": 0.030273325077345755,
"acc_norm": 0.5404411764705882,
"acc_norm_stderr": 0.030273325077345755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.565359477124183,
"acc_stderr": 0.020054269200726463,
"acc_norm": 0.565359477124183,
"acc_norm_stderr": 0.020054269200726463
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5836734693877551,
"acc_stderr": 0.031557828165561644,
"acc_norm": 0.5836734693877551,
"acc_norm_stderr": 0.031557828165561644
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6766169154228856,
"acc_stderr": 0.03307615947979033,
"acc_norm": 0.6766169154228856,
"acc_norm_stderr": 0.03307615947979033
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.03889951252827217,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.03889951252827217
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03377310252209205,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03377310252209205
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3843329253365973,
"mc1_stderr": 0.017028707301245196,
"mc2": 0.5594851543429306,
"mc2_stderr": 0.016227878204646183
}
}
```
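The per-task keys in the JSON above (e.g. `harness|hendrycksTest-virology|5`) correspond to the config names declared in this card's front matter (e.g. `harness_hendrycksTest_virology_5`). A small helper, sketched from the names visible in this card, converts one to the other:

```python
def task_to_config_name(task: str) -> str:
    # Assumption based on the names in this card: "|", "-", and ":" in a
    # task key all become "_" in the corresponding config name.
    return task.replace("|", "_").replace("-", "_").replace(":", "_")

print(task_to_config_name("harness|truthfulqa:mc|0"))
# harness_truthfulqa_mc_0
```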
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Monero__WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b | 2023-09-16T20:24:46.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Monero/WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Monero/WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b](https://huggingface.co/Monero/WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one\
\ of the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Monero__WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-16T20:24:34.064678](https://huggingface.co/datasets/open-llm-leaderboard/details_Monero__WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b/blob/main/results_2023-09-16T20-24-34.064678.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.23898909395973153,\n\
\ \"em_stderr\": 0.004367411698321815,\n \"f1\": 0.33218645134228264,\n\
\ \"f1_stderr\": 0.0042948501285767545,\n \"acc\": 0.37893683059743066,\n\
\ \"acc_stderr\": 0.008783513808235714\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.23898909395973153,\n \"em_stderr\": 0.004367411698321815,\n\
\ \"f1\": 0.33218645134228264,\n \"f1_stderr\": 0.0042948501285767545\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03411675511751327,\n \
\ \"acc_stderr\": 0.005000212600773271\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7237569060773481,\n \"acc_stderr\": 0.012566815015698158\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Monero/WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|arc:challenge|25_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_16T20_24_34.064678
path:
- '**/details_harness|drop|3_2023-09-16T20-24-34.064678.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-16T20-24-34.064678.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_16T20_24_34.064678
path:
- '**/details_harness|gsm8k|5_2023-09-16T20-24-34.064678.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-16T20-24-34.064678.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hellaswag|10_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:53:40.714431.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T22:53:40.714431.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T22:53:40.714431.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_16T20_24_34.064678
path:
- '**/details_harness|winogrande|5_2023-09-16T20-24-34.064678.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-16T20-24-34.064678.parquet'
- config_name: results
data_files:
- split: 2023_07_19T22_53_40.714431
path:
- results_2023-07-19T22:53:40.714431.parquet
- split: 2023_09_16T20_24_34.064678
path:
- results_2023-09-16T20-24-34.064678.parquet
- split: latest
path:
- results_2023-09-16T20-24-34.064678.parquet
---
# Dataset Card for Evaluation run of Monero/WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Monero/WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Monero/WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b](https://huggingface.co/Monero/WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Monero__WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b",
"harness_winogrande_5",
split="train")
```
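Split names encode the run timestamp with `_` substituted for the `:` and `-` of the ISO time component (e.g. `2023_09_16T20_24_34.064678`). The helper below is an illustrative sketch (not part of the leaderboard tooling) showing how such a split name can be converted back into a `datetime` for sorting or comparing runs:

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    # Split names such as "2023_09_16T20_24_34.064678" replace the ":"
    # of the ISO timestamp with "_" (and "-" with "_" in the date part);
    # restore the separators before parsing.
    date_part, time_part = split_name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

print(split_to_datetime("2023_09_16T20_24_34.064678"))
```

This can be used, for instance, to programmatically pick the most recent non-"latest" split of a configuration by sorting the parsed timestamps.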
## Latest results
These are the [latest results from run 2023-09-16T20:24:34.064678](https://huggingface.co/datasets/open-llm-leaderboard/details_Monero__WizardLM-30B-Uncensored-Guanaco-SuperCOT-30b/blob/main/results_2023-09-16T20-24-34.064678.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.23898909395973153,
"em_stderr": 0.004367411698321815,
"f1": 0.33218645134228264,
"f1_stderr": 0.0042948501285767545,
"acc": 0.37893683059743066,
"acc_stderr": 0.008783513808235714
},
"harness|drop|3": {
"em": 0.23898909395973153,
"em_stderr": 0.004367411698321815,
"f1": 0.33218645134228264,
"f1_stderr": 0.0042948501285767545
},
"harness|gsm8k|5": {
"acc": 0.03411675511751327,
"acc_stderr": 0.005000212600773271
},
"harness|winogrande|5": {
"acc": 0.7237569060773481,
"acc_stderr": 0.012566815015698158
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-econometrics-neg-prepend-fix | 2023-08-21T07:33:51.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 7799
num_examples: 5
- name: test
num_bytes: 374298
num_examples: 114
download_size: 15514
dataset_size: 382097
---
# Dataset Card for "mmlu-econometrics-neg-prepend-fix"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Monero__WizardLM-13b-OpenAssistant-Uncensored | 2023-08-27T12:39:11.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Monero/WizardLM-13b-OpenAssistant-Uncensored
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Monero/WizardLM-13b-OpenAssistant-Uncensored](https://huggingface.co/Monero/WizardLM-13b-OpenAssistant-Uncensored)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Monero__WizardLM-13b-OpenAssistant-Uncensored\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-24T13:19:46.120790](https://huggingface.co/datasets/open-llm-leaderboard/details_Monero__WizardLM-13b-OpenAssistant-Uncensored/blob/main/results_2023-07-24T13%3A19%3A46.120790.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each task in its own configuration, with a \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.43447909613148955,\n\
\ \"acc_stderr\": 0.0351971066740345,\n \"acc_norm\": 0.4379537574852551,\n\
\ \"acc_norm_stderr\": 0.03518588279234739,\n \"mc1\": 0.3329253365973072,\n\
\ \"mc1_stderr\": 0.01649740238201206,\n \"mc2\": 0.4940407873488723,\n\
\ \"mc2_stderr\": 0.01598359583562834\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4709897610921502,\n \"acc_stderr\": 0.014586776355294321,\n\
\ \"acc_norm\": 0.4854948805460751,\n \"acc_norm_stderr\": 0.014605241081370053\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.569806811392153,\n\
\ \"acc_stderr\": 0.004940911779273365,\n \"acc_norm\": 0.7603067118103963,\n\
\ \"acc_norm_stderr\": 0.004260238033657913\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4,\n \
\ \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4528301886792453,\n \"acc_stderr\": 0.030635627957961823,\n\
\ \"acc_norm\": 0.4528301886792453,\n \"acc_norm_stderr\": 0.030635627957961823\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4097222222222222,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.4097222222222222,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3815028901734104,\n\
\ \"acc_stderr\": 0.0370385119309952,\n \"acc_norm\": 0.3815028901734104,\n\
\ \"acc_norm_stderr\": 0.0370385119309952\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.038739587141493524,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.038739587141493524\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.37872340425531914,\n \"acc_stderr\": 0.03170995606040655,\n\
\ \"acc_norm\": 0.37872340425531914,\n \"acc_norm_stderr\": 0.03170995606040655\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3724137931034483,\n \"acc_stderr\": 0.04028731532947559,\n\
\ \"acc_norm\": 0.3724137931034483,\n \"acc_norm_stderr\": 0.04028731532947559\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30423280423280424,\n \"acc_stderr\": 0.023695415009463087,\n \"\
acc_norm\": 0.30423280423280424,\n \"acc_norm_stderr\": 0.023695415009463087\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235172,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4161290322580645,\n\
\ \"acc_stderr\": 0.028040981380761543,\n \"acc_norm\": 0.4161290322580645,\n\
\ \"acc_norm_stderr\": 0.028040981380761543\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.03178529710642749,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.03178529710642749\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5151515151515151,\n \"acc_stderr\": 0.03902551007374449,\n\
\ \"acc_norm\": 0.5151515151515151,\n \"acc_norm_stderr\": 0.03902551007374449\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5202020202020202,\n \"acc_stderr\": 0.035594435655639176,\n \"\
acc_norm\": 0.5202020202020202,\n \"acc_norm_stderr\": 0.035594435655639176\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6113989637305699,\n \"acc_stderr\": 0.03517739796373131,\n\
\ \"acc_norm\": 0.6113989637305699,\n \"acc_norm_stderr\": 0.03517739796373131\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3974358974358974,\n \"acc_stderr\": 0.024811920017903836,\n\
\ \"acc_norm\": 0.3974358974358974,\n \"acc_norm_stderr\": 0.024811920017903836\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145668,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145668\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.41596638655462187,\n \"acc_stderr\": 0.03201650100739615,\n\
\ \"acc_norm\": 0.41596638655462187,\n \"acc_norm_stderr\": 0.03201650100739615\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473836,\n \"\
acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473836\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5394495412844037,\n \"acc_stderr\": 0.021370494609995093,\n \"\
acc_norm\": 0.5394495412844037,\n \"acc_norm_stderr\": 0.021370494609995093\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.030546745264953178,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.030546745264953178\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5784313725490197,\n \"acc_stderr\": 0.03465868196380761,\n \"\
acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.03465868196380761\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5738396624472574,\n \"acc_stderr\": 0.032190357031317736,\n \
\ \"acc_norm\": 0.5738396624472574,\n \"acc_norm_stderr\": 0.032190357031317736\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5650224215246636,\n\
\ \"acc_stderr\": 0.03327283370271344,\n \"acc_norm\": 0.5650224215246636,\n\
\ \"acc_norm_stderr\": 0.03327283370271344\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4198473282442748,\n \"acc_stderr\": 0.04328577215262972,\n\
\ \"acc_norm\": 0.4198473282442748,\n \"acc_norm_stderr\": 0.04328577215262972\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5371900826446281,\n \"acc_stderr\": 0.04551711196104218,\n \"\
acc_norm\": 0.5371900826446281,\n \"acc_norm_stderr\": 0.04551711196104218\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.048262172941398944,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.048262172941398944\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.43558282208588955,\n \"acc_stderr\": 0.03895632464138936,\n\
\ \"acc_norm\": 0.43558282208588955,\n \"acc_norm_stderr\": 0.03895632464138936\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5728155339805825,\n \"acc_stderr\": 0.04897957737781168,\n\
\ \"acc_norm\": 0.5728155339805825,\n \"acc_norm_stderr\": 0.04897957737781168\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6837606837606838,\n\
\ \"acc_stderr\": 0.03046365674734026,\n \"acc_norm\": 0.6837606837606838,\n\
\ \"acc_norm_stderr\": 0.03046365674734026\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6091954022988506,\n\
\ \"acc_stderr\": 0.017448366067062526,\n \"acc_norm\": 0.6091954022988506,\n\
\ \"acc_norm_stderr\": 0.017448366067062526\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4277456647398844,\n \"acc_stderr\": 0.026636539741116072,\n\
\ \"acc_norm\": 0.4277456647398844,\n \"acc_norm_stderr\": 0.026636539741116072\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2670391061452514,\n\
\ \"acc_stderr\": 0.014796502622562551,\n \"acc_norm\": 0.2670391061452514,\n\
\ \"acc_norm_stderr\": 0.014796502622562551\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.02843109544417664,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.02843109544417664\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.47266881028938906,\n\
\ \"acc_stderr\": 0.02835563356832818,\n \"acc_norm\": 0.47266881028938906,\n\
\ \"acc_norm_stderr\": 0.02835563356832818\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4660493827160494,\n \"acc_stderr\": 0.02775653525734767,\n\
\ \"acc_norm\": 0.4660493827160494,\n \"acc_norm_stderr\": 0.02775653525734767\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3475177304964539,\n \"acc_stderr\": 0.028406627809590954,\n \
\ \"acc_norm\": 0.3475177304964539,\n \"acc_norm_stderr\": 0.028406627809590954\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3376792698826597,\n\
\ \"acc_stderr\": 0.012078563777145572,\n \"acc_norm\": 0.3376792698826597,\n\
\ \"acc_norm_stderr\": 0.012078563777145572\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.41544117647058826,\n \"acc_stderr\": 0.029935342707877743,\n\
\ \"acc_norm\": 0.41544117647058826,\n \"acc_norm_stderr\": 0.029935342707877743\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.434640522875817,\n \"acc_stderr\": 0.02005426920072646,\n \
\ \"acc_norm\": 0.434640522875817,\n \"acc_norm_stderr\": 0.02005426920072646\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n\
\ \"acc_stderr\": 0.04785964010794916,\n \"acc_norm\": 0.5181818181818182,\n\
\ \"acc_norm_stderr\": 0.04785964010794916\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.031680911612338825,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.031680911612338825\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5074626865671642,\n\
\ \"acc_stderr\": 0.035351400842767194,\n \"acc_norm\": 0.5074626865671642,\n\
\ \"acc_norm_stderr\": 0.035351400842767194\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n\
\ \"acc_stderr\": 0.0378913442461155,\n \"acc_norm\": 0.3855421686746988,\n\
\ \"acc_norm_stderr\": 0.0378913442461155\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.036996580176568775,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.036996580176568775\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3329253365973072,\n\
\ \"mc1_stderr\": 0.01649740238201206,\n \"mc2\": 0.4940407873488723,\n\
\ \"mc2_stderr\": 0.01598359583562834\n }\n}\n```"
repo_url: https://huggingface.co/Monero/WizardLM-13b-OpenAssistant-Uncensored
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|arc:challenge|25_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hellaswag|10_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T13:19:46.120790.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:19:46.120790.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T13:19:46.120790.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T13:19:46.120790.parquet'
- config_name: results
data_files:
- split: 2023_07_24T13_19_46.120790
path:
- results_2023-07-24T13:19:46.120790.parquet
- split: latest
path:
- results_2023-07-24T13:19:46.120790.parquet
---
# Dataset Card for Evaluation run of Monero/WizardLM-13b-OpenAssistant-Uncensored
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Monero/WizardLM-13b-OpenAssistant-Uncensored
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Monero/WizardLM-13b-OpenAssistant-Uncensored](https://huggingface.co/Monero/WizardLM-13b-OpenAssistant-Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Monero__WizardLM-13b-OpenAssistant-Uncensored",
"harness_truthfulqa_mc_0",
split="train")
```
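The config names listed above can be read off directly from the harness task identifiers: judging from the listing, the `|`, `:`, and `-` characters in a task name such as `harness|truthfulqa:mc|0` are replaced with underscores to form the config name `harness_truthfulqa_mc_0`. A minimal sketch of that mapping, assuming only this naming convention (the helper name `task_to_config` is illustrative, not part of any API):

```python
import re

def task_to_config(task: str) -> str:
    # Illustrative helper: derive the config name passed to load_dataset
    # from a harness task id by replacing "|", ":", and "-" with "_".
    # e.g. "harness|hendrycksTest-world_religions|5"
    #   -> "harness_hendrycksTest_world_religions_5"
    return re.sub(r"[|:\-]", "_", task)

print(task_to_config("harness|truthfulqa:mc|0"))  # -> harness_truthfulqa_mc_0
```

This is only a convenience for locating the right config; the authoritative list of config names is the YAML metadata above.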
## Latest results
These are the [latest results from run 2023-07-24T13:19:46.120790](https://huggingface.co/datasets/open-llm-leaderboard/details_Monero__WizardLM-13b-OpenAssistant-Uncensored/blob/main/results_2023-07-24T13%3A19%3A46.120790.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.43447909613148955,
"acc_stderr": 0.0351971066740345,
"acc_norm": 0.4379537574852551,
"acc_norm_stderr": 0.03518588279234739,
"mc1": 0.3329253365973072,
"mc1_stderr": 0.01649740238201206,
"mc2": 0.4940407873488723,
"mc2_stderr": 0.01598359583562834
},
"harness|arc:challenge|25": {
"acc": 0.4709897610921502,
"acc_stderr": 0.014586776355294321,
"acc_norm": 0.4854948805460751,
"acc_norm_stderr": 0.014605241081370053
},
"harness|hellaswag|10": {
"acc": 0.569806811392153,
"acc_stderr": 0.004940911779273365,
"acc_norm": 0.7603067118103963,
"acc_norm_stderr": 0.004260238033657913
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4,
"acc_stderr": 0.042320736951515885,
"acc_norm": 0.4,
"acc_norm_stderr": 0.042320736951515885
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4528301886792453,
"acc_stderr": 0.030635627957961823,
"acc_norm": 0.4528301886792453,
"acc_norm_stderr": 0.030635627957961823
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4097222222222222,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.4097222222222222,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3815028901734104,
"acc_stderr": 0.0370385119309952,
"acc_norm": 0.3815028901734104,
"acc_norm_stderr": 0.0370385119309952
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.038739587141493524,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.038739587141493524
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.37872340425531914,
"acc_stderr": 0.03170995606040655,
"acc_norm": 0.37872340425531914,
"acc_norm_stderr": 0.03170995606040655
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3724137931034483,
"acc_stderr": 0.04028731532947559,
"acc_norm": 0.3724137931034483,
"acc_norm_stderr": 0.04028731532947559
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30423280423280424,
"acc_stderr": 0.023695415009463087,
"acc_norm": 0.30423280423280424,
"acc_norm_stderr": 0.023695415009463087
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235172,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4161290322580645,
"acc_stderr": 0.028040981380761543,
"acc_norm": 0.4161290322580645,
"acc_norm_stderr": 0.028040981380761543
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.03178529710642749,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.03178529710642749
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5151515151515151,
"acc_stderr": 0.03902551007374449,
"acc_norm": 0.5151515151515151,
"acc_norm_stderr": 0.03902551007374449
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5202020202020202,
"acc_stderr": 0.035594435655639176,
"acc_norm": 0.5202020202020202,
"acc_norm_stderr": 0.035594435655639176
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6113989637305699,
"acc_stderr": 0.03517739796373131,
"acc_norm": 0.6113989637305699,
"acc_norm_stderr": 0.03517739796373131
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3974358974358974,
"acc_stderr": 0.024811920017903836,
"acc_norm": 0.3974358974358974,
"acc_norm_stderr": 0.024811920017903836
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145668,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145668
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.41596638655462187,
"acc_stderr": 0.03201650100739615,
"acc_norm": 0.41596638655462187,
"acc_norm_stderr": 0.03201650100739615
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2052980132450331,
"acc_stderr": 0.03297986648473836,
"acc_norm": 0.2052980132450331,
"acc_norm_stderr": 0.03297986648473836
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5394495412844037,
"acc_stderr": 0.021370494609995093,
"acc_norm": 0.5394495412844037,
"acc_norm_stderr": 0.021370494609995093
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.030546745264953178,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.030546745264953178
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.03465868196380761,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.03465868196380761
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5738396624472574,
"acc_stderr": 0.032190357031317736,
"acc_norm": 0.5738396624472574,
"acc_norm_stderr": 0.032190357031317736
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5650224215246636,
"acc_stderr": 0.03327283370271344,
"acc_norm": 0.5650224215246636,
"acc_norm_stderr": 0.03327283370271344
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4198473282442748,
"acc_stderr": 0.04328577215262972,
"acc_norm": 0.4198473282442748,
"acc_norm_stderr": 0.04328577215262972
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5371900826446281,
"acc_stderr": 0.04551711196104218,
"acc_norm": 0.5371900826446281,
"acc_norm_stderr": 0.04551711196104218
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.048262172941398944,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.048262172941398944
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.43558282208588955,
"acc_stderr": 0.03895632464138936,
"acc_norm": 0.43558282208588955,
"acc_norm_stderr": 0.03895632464138936
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.5728155339805825,
"acc_stderr": 0.04897957737781168,
"acc_norm": 0.5728155339805825,
"acc_norm_stderr": 0.04897957737781168
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6837606837606838,
"acc_stderr": 0.03046365674734026,
"acc_norm": 0.6837606837606838,
"acc_norm_stderr": 0.03046365674734026
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6091954022988506,
"acc_stderr": 0.017448366067062526,
"acc_norm": 0.6091954022988506,
"acc_norm_stderr": 0.017448366067062526
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.026636539741116072,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.026636539741116072
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2670391061452514,
"acc_stderr": 0.014796502622562551,
"acc_norm": 0.2670391061452514,
"acc_norm_stderr": 0.014796502622562551
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.02843109544417664,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.02843109544417664
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.47266881028938906,
"acc_stderr": 0.02835563356832818,
"acc_norm": 0.47266881028938906,
"acc_norm_stderr": 0.02835563356832818
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4660493827160494,
"acc_stderr": 0.02775653525734767,
"acc_norm": 0.4660493827160494,
"acc_norm_stderr": 0.02775653525734767
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3475177304964539,
"acc_stderr": 0.028406627809590954,
"acc_norm": 0.3475177304964539,
"acc_norm_stderr": 0.028406627809590954
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3376792698826597,
"acc_stderr": 0.012078563777145572,
"acc_norm": 0.3376792698826597,
"acc_norm_stderr": 0.012078563777145572
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41544117647058826,
"acc_stderr": 0.029935342707877743,
"acc_norm": 0.41544117647058826,
"acc_norm_stderr": 0.029935342707877743
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.434640522875817,
"acc_stderr": 0.02005426920072646,
"acc_norm": 0.434640522875817,
"acc_norm_stderr": 0.02005426920072646
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794916,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794916
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.031680911612338825,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.031680911612338825
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5074626865671642,
"acc_stderr": 0.035351400842767194,
"acc_norm": 0.5074626865671642,
"acc_norm_stderr": 0.035351400842767194
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.0378913442461155,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.0378913442461155
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.036996580176568775,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.036996580176568775
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3329253365973072,
"mc1_stderr": 0.01649740238201206,
"mc2": 0.4940407873488723,
"mc2_stderr": 0.01598359583562834
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-electrical_engineering-neg-prepend-fix | 2023-08-21T07:34:03.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 5473
num_examples: 5
- name: test
num_bytes: 275445
num_examples: 145
download_size: 13670
dataset_size: 280918
---
# Dataset Card for "mmlu-electrical_engineering-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_wannaphong__openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard | 2023-08-27T12:39:12.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of wannaphong/openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [wannaphong/openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard](https://huggingface.co/wannaphong/openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wannaphong__openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-19T19:43:56.163640](https://huggingface.co/datasets/open-llm-leaderboard/details_wannaphong__openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard/blob/main/results_2023-07-19T19%3A43%3A56.163640.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3383082942783733,\n\
\ \"acc_stderr\": 0.034038904937501814,\n \"acc_norm\": 0.3424207667888371,\n\
\ \"acc_norm_stderr\": 0.03402640930744709,\n \"mc1\": 0.2741738066095471,\n\
\ \"mc1_stderr\": 0.015616518497219373,\n \"mc2\": 0.4327576136566873,\n\
\ \"mc2_stderr\": 0.015062768361653264\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4684300341296928,\n \"acc_stderr\": 0.014582236460866984,\n\
\ \"acc_norm\": 0.5127986348122867,\n \"acc_norm_stderr\": 0.014606603181012538\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5763792073292173,\n\
\ \"acc_stderr\": 0.004931219148182242,\n \"acc_norm\": 0.7746464847639912,\n\
\ \"acc_norm_stderr\": 0.0041696102548079705\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34814814814814815,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.34814814814814815,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.037827289808654685,\n\
\ \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.037827289808654685\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.42,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.37735849056603776,\n \"acc_stderr\": 0.029832808114796005,\n\
\ \"acc_norm\": 0.37735849056603776,\n \"acc_norm_stderr\": 0.029832808114796005\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2916666666666667,\n\
\ \"acc_stderr\": 0.03800968060554858,\n \"acc_norm\": 0.2916666666666667,\n\
\ \"acc_norm_stderr\": 0.03800968060554858\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3063583815028902,\n\
\ \"acc_stderr\": 0.03514942551267437,\n \"acc_norm\": 0.3063583815028902,\n\
\ \"acc_norm_stderr\": 0.03514942551267437\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3617021276595745,\n \"acc_stderr\": 0.03141082197596241,\n\
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.03141082197596241\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n\
\ \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24338624338624337,\n \"acc_stderr\": 0.022101128787415426,\n \"\
acc_norm\": 0.24338624338624337,\n \"acc_norm_stderr\": 0.022101128787415426\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.3387096774193548,\n \"acc_stderr\": 0.026923446059302834,\n \"\
acc_norm\": 0.3387096774193548,\n \"acc_norm_stderr\": 0.026923446059302834\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.28078817733990147,\n \"acc_stderr\": 0.0316185633535861,\n \"\
acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.0316185633535861\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.41818181818181815,\n \"acc_stderr\": 0.03851716319398393,\n\
\ \"acc_norm\": 0.41818181818181815,\n \"acc_norm_stderr\": 0.03851716319398393\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.41919191919191917,\n \"acc_stderr\": 0.035155207286704175,\n \"\
acc_norm\": 0.41919191919191917,\n \"acc_norm_stderr\": 0.035155207286704175\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.42487046632124353,\n \"acc_stderr\": 0.0356747133521254,\n\
\ \"acc_norm\": 0.42487046632124353,\n \"acc_norm_stderr\": 0.0356747133521254\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3128205128205128,\n \"acc_stderr\": 0.02350757902064535,\n \
\ \"acc_norm\": 0.3128205128205128,\n \"acc_norm_stderr\": 0.02350757902064535\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.02959732973097809,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.02959732973097809\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.036030385453603854,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.036030385453603854\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3577981651376147,\n \"acc_stderr\": 0.020552060784827814,\n \"\
acc_norm\": 0.3577981651376147,\n \"acc_norm_stderr\": 0.020552060784827814\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2824074074074074,\n \"acc_stderr\": 0.03070137211151094,\n \"\
acc_norm\": 0.2824074074074074,\n \"acc_norm_stderr\": 0.03070137211151094\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.37254901960784315,\n \"acc_stderr\": 0.03393388584958404,\n \"\
acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.03393388584958404\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3755274261603376,\n \"acc_stderr\": 0.03152256243091156,\n \
\ \"acc_norm\": 0.3755274261603376,\n \"acc_norm_stderr\": 0.03152256243091156\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.36771300448430494,\n\
\ \"acc_stderr\": 0.03236198350928276,\n \"acc_norm\": 0.36771300448430494,\n\
\ \"acc_norm_stderr\": 0.03236198350928276\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.32061068702290074,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.32061068702290074,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5702479338842975,\n \"acc_stderr\": 0.04519082021319773,\n \"\
acc_norm\": 0.5702479338842975,\n \"acc_norm_stderr\": 0.04519082021319773\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.35185185185185186,\n\
\ \"acc_stderr\": 0.04616631111801713,\n \"acc_norm\": 0.35185185185185186,\n\
\ \"acc_norm_stderr\": 0.04616631111801713\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.34355828220858897,\n \"acc_stderr\": 0.03731133519673893,\n\
\ \"acc_norm\": 0.34355828220858897,\n \"acc_norm_stderr\": 0.03731133519673893\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.038946411200447915,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.038946411200447915\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.32038834951456313,\n \"acc_stderr\": 0.04620284082280039,\n\
\ \"acc_norm\": 0.32038834951456313,\n \"acc_norm_stderr\": 0.04620284082280039\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.4188034188034188,\n\
\ \"acc_stderr\": 0.03232128912157791,\n \"acc_norm\": 0.4188034188034188,\n\
\ \"acc_norm_stderr\": 0.03232128912157791\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4125159642401022,\n\
\ \"acc_stderr\": 0.01760414910867193,\n \"acc_norm\": 0.4125159642401022,\n\
\ \"acc_norm_stderr\": 0.01760414910867193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3439306358381503,\n \"acc_stderr\": 0.025574123786546648,\n\
\ \"acc_norm\": 0.3439306358381503,\n \"acc_norm_stderr\": 0.025574123786546648\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02699254433929725,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02699254433929725\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.33762057877813506,\n\
\ \"acc_stderr\": 0.026858825879488547,\n \"acc_norm\": 0.33762057877813506,\n\
\ \"acc_norm_stderr\": 0.026858825879488547\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.35802469135802467,\n \"acc_stderr\": 0.026675611926037093,\n\
\ \"acc_norm\": 0.35802469135802467,\n \"acc_norm_stderr\": 0.026675611926037093\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590627,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590627\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3089960886571056,\n\
\ \"acc_stderr\": 0.011801729777239249,\n \"acc_norm\": 0.3089960886571056,\n\
\ \"acc_norm_stderr\": 0.011801729777239249\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.33088235294117646,\n \"acc_stderr\": 0.02858270975389844,\n\
\ \"acc_norm\": 0.33088235294117646,\n \"acc_norm_stderr\": 0.02858270975389844\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3284313725490196,\n \"acc_stderr\": 0.01899970738316267,\n \
\ \"acc_norm\": 0.3284313725490196,\n \"acc_norm_stderr\": 0.01899970738316267\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.27755102040816326,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.27755102040816326,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4228855721393035,\n\
\ \"acc_stderr\": 0.034932317774212816,\n \"acc_norm\": 0.4228855721393035,\n\
\ \"acc_norm_stderr\": 0.034932317774212816\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3614457831325301,\n\
\ \"acc_stderr\": 0.037400593820293204,\n \"acc_norm\": 0.3614457831325301,\n\
\ \"acc_norm_stderr\": 0.037400593820293204\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.38011695906432746,\n \"acc_stderr\": 0.037229657413855394,\n\
\ \"acc_norm\": 0.38011695906432746,\n \"acc_norm_stderr\": 0.037229657413855394\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2741738066095471,\n\
\ \"mc1_stderr\": 0.015616518497219373,\n \"mc2\": 0.4327576136566873,\n\
\ \"mc2_stderr\": 0.015062768361653264\n }\n}\n```"
repo_url: https://huggingface.co/wannaphong/openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|arc:challenge|25_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hellaswag|10_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:43:56.163640.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:43:56.163640.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T19:43:56.163640.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T19:43:56.163640.parquet'
- config_name: results
data_files:
- split: 2023_07_19T19_43_56.163640
path:
- results_2023-07-19T19:43:56.163640.parquet
- split: latest
path:
- results_2023-07-19T19:43:56.163640.parquet
---
# Dataset Card for Evaluation run of wannaphong/openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/wannaphong/openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [wannaphong/openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard](https://huggingface.co/wannaphong/openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wannaphong__openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard",
"harness_truthfulqa_mc_0",
split="train")
```
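The per-run splits follow a fixed naming scheme visible in the configurations above: the run timestamp with `-` and `:` replaced by `_` (the `latest` split always mirrors the most recent run). A small helper like the following (illustrative only, not part of the official tooling) derives a split name from a run timestamp:

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp (as it appears in the results file names)
    to the corresponding split name used in this dataset's configs."""
    return timestamp.replace("-", "_").replace(":", "_")

# The run shown in this card:
print(run_timestamp_to_split("2023-07-19T19:43:56.163640"))
# 2023_07_19T19_43_56.163640
```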
## Latest results
These are the [latest results from run 2023-07-19T19:43:56.163640](https://huggingface.co/datasets/open-llm-leaderboard/details_wannaphong__openthaigpt-0.1.0-beta-full-model_for_open_llm_leaderboard/blob/main/results_2023-07-19T19%3A43%3A56.163640.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3383082942783733,
"acc_stderr": 0.034038904937501814,
"acc_norm": 0.3424207667888371,
"acc_norm_stderr": 0.03402640930744709,
"mc1": 0.2741738066095471,
"mc1_stderr": 0.015616518497219373,
"mc2": 0.4327576136566873,
"mc2_stderr": 0.015062768361653264
},
"harness|arc:challenge|25": {
"acc": 0.4684300341296928,
"acc_stderr": 0.014582236460866984,
"acc_norm": 0.5127986348122867,
"acc_norm_stderr": 0.014606603181012538
},
"harness|hellaswag|10": {
"acc": 0.5763792073292173,
"acc_stderr": 0.004931219148182242,
"acc_norm": 0.7746464847639912,
"acc_norm_stderr": 0.0041696102548079705
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.037827289808654685,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.037827289808654685
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.37735849056603776,
"acc_stderr": 0.029832808114796005,
"acc_norm": 0.37735849056603776,
"acc_norm_stderr": 0.029832808114796005
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.03800968060554858,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.03800968060554858
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3063583815028902,
"acc_stderr": 0.03514942551267437,
"acc_norm": 0.3063583815028902,
"acc_norm_stderr": 0.03514942551267437
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.03141082197596241,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.03141082197596241
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24338624338624337,
"acc_stderr": 0.022101128787415426,
"acc_norm": 0.24338624338624337,
"acc_norm_stderr": 0.022101128787415426
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.037184890068181146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.037184890068181146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3387096774193548,
"acc_stderr": 0.026923446059302834,
"acc_norm": 0.3387096774193548,
"acc_norm_stderr": 0.026923446059302834
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.0316185633535861,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.0316185633535861
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.41818181818181815,
"acc_stderr": 0.03851716319398393,
"acc_norm": 0.41818181818181815,
"acc_norm_stderr": 0.03851716319398393
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.41919191919191917,
"acc_stderr": 0.035155207286704175,
"acc_norm": 0.41919191919191917,
"acc_norm_stderr": 0.035155207286704175
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.42487046632124353,
"acc_stderr": 0.0356747133521254,
"acc_norm": 0.42487046632124353,
"acc_norm_stderr": 0.0356747133521254
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3128205128205128,
"acc_stderr": 0.02350757902064535,
"acc_norm": 0.3128205128205128,
"acc_norm_stderr": 0.02350757902064535
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.02959732973097809,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.02959732973097809
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.036030385453603854,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.036030385453603854
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3577981651376147,
"acc_stderr": 0.020552060784827814,
"acc_norm": 0.3577981651376147,
"acc_norm_stderr": 0.020552060784827814
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2824074074074074,
"acc_stderr": 0.03070137211151094,
"acc_norm": 0.2824074074074074,
"acc_norm_stderr": 0.03070137211151094
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.03393388584958404,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.03393388584958404
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3755274261603376,
"acc_stderr": 0.03152256243091156,
"acc_norm": 0.3755274261603376,
"acc_norm_stderr": 0.03152256243091156
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.36771300448430494,
"acc_stderr": 0.03236198350928276,
"acc_norm": 0.36771300448430494,
"acc_norm_stderr": 0.03236198350928276
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.32061068702290074,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.32061068702290074,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5702479338842975,
"acc_stderr": 0.04519082021319773,
"acc_norm": 0.5702479338842975,
"acc_norm_stderr": 0.04519082021319773
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.04616631111801713,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.04616631111801713
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.34355828220858897,
"acc_stderr": 0.03731133519673893,
"acc_norm": 0.34355828220858897,
"acc_norm_stderr": 0.03731133519673893
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.038946411200447915,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.038946411200447915
},
"harness|hendrycksTest-management|5": {
"acc": 0.32038834951456313,
"acc_stderr": 0.04620284082280039,
"acc_norm": 0.32038834951456313,
"acc_norm_stderr": 0.04620284082280039
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.4188034188034188,
"acc_stderr": 0.03232128912157791,
"acc_norm": 0.4188034188034188,
"acc_norm_stderr": 0.03232128912157791
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4125159642401022,
"acc_stderr": 0.01760414910867193,
"acc_norm": 0.4125159642401022,
"acc_norm_stderr": 0.01760414910867193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3439306358381503,
"acc_stderr": 0.025574123786546648,
"acc_norm": 0.3439306358381503,
"acc_norm_stderr": 0.025574123786546648
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02699254433929725,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02699254433929725
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.33762057877813506,
"acc_stderr": 0.026858825879488547,
"acc_norm": 0.33762057877813506,
"acc_norm_stderr": 0.026858825879488547
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.35802469135802467,
"acc_stderr": 0.026675611926037093,
"acc_norm": 0.35802469135802467,
"acc_norm_stderr": 0.026675611926037093
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590627,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590627
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3089960886571056,
"acc_stderr": 0.011801729777239249,
"acc_norm": 0.3089960886571056,
"acc_norm_stderr": 0.011801729777239249
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.33088235294117646,
"acc_stderr": 0.02858270975389844,
"acc_norm": 0.33088235294117646,
"acc_norm_stderr": 0.02858270975389844
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3284313725490196,
"acc_stderr": 0.01899970738316267,
"acc_norm": 0.3284313725490196,
"acc_norm_stderr": 0.01899970738316267
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.4,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.27755102040816326,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.27755102040816326,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4228855721393035,
"acc_stderr": 0.034932317774212816,
"acc_norm": 0.4228855721393035,
"acc_norm_stderr": 0.034932317774212816
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3614457831325301,
"acc_stderr": 0.037400593820293204,
"acc_norm": 0.3614457831325301,
"acc_norm_stderr": 0.037400593820293204
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.38011695906432746,
"acc_stderr": 0.037229657413855394,
"acc_norm": 0.38011695906432746,
"acc_norm_stderr": 0.037229657413855394
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2741738066095471,
"mc1_stderr": 0.015616518497219373,
"mc2": 0.4327576136566873,
"mc2_stderr": 0.015062768361653264
}
}
```
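The `"all"` block aggregates the per-task metrics. Assuming a plain unweighted mean over tasks (an assumption for illustration; the leaderboard's exact aggregation may weight or group tasks differently), the computation can be sketched as:

```python
from statistics import mean

# A few of the per-task accuracies copied from the results above
task_acc = {
    "harness|arc:challenge|25": 0.4684300341296928,
    "harness|hellaswag|10": 0.5763792073292173,
    "harness|hendrycksTest-abstract_algebra|5": 0.23,
}

# Unweighted macro-average over the selected tasks
macro_acc = mean(task_acc.values())
print(round(macro_acc, 4))
# 0.4249
```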
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_frank098__WizardLM_13B_juniper | 2023-08-27T12:39:14.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of frank098/WizardLM_13B_juniper
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [frank098/WizardLM_13B_juniper](https://huggingface.co/frank098/WizardLM_13B_juniper)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_frank098__WizardLM_13B_juniper\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-24T12:54:22.349435](https://huggingface.co/datasets/open-llm-leaderboard/details_frank098__WizardLM_13B_juniper/blob/main/results_2023-07-24T12%3A54%3A22.349435.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.45808625384350504,\n\
\ \"acc_stderr\": 0.03525693152630926,\n \"acc_norm\": 0.4616346812749258,\n\
\ \"acc_norm_stderr\": 0.03524363986169768,\n \"mc1\": 0.3537331701346389,\n\
\ \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5149590028669258,\n\
\ \"mc2_stderr\": 0.01603000361270752\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5366894197952219,\n \"acc_stderr\": 0.014572000527756989,\n\
\ \"acc_norm\": 0.5537542662116041,\n \"acc_norm_stderr\": 0.014526705548539982\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5796654052977495,\n\
\ \"acc_stderr\": 0.004926038197714513,\n \"acc_norm\": 0.7719577773351922,\n\
\ \"acc_norm_stderr\": 0.004187124964848515\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n\
\ \"acc_stderr\": 0.042561937679014075,\n \"acc_norm\": 0.4148148148148148,\n\
\ \"acc_norm_stderr\": 0.042561937679014075\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4276315789473684,\n \"acc_stderr\": 0.04026097083296559,\n\
\ \"acc_norm\": 0.4276315789473684,\n \"acc_norm_stderr\": 0.04026097083296559\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.38,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.45660377358490567,\n \"acc_stderr\": 0.030656748696739435,\n\
\ \"acc_norm\": 0.45660377358490567,\n \"acc_norm_stderr\": 0.030656748696739435\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4305555555555556,\n\
\ \"acc_stderr\": 0.04140685639111502,\n \"acc_norm\": 0.4305555555555556,\n\
\ \"acc_norm_stderr\": 0.04140685639111502\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n\
\ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3583815028901734,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.3583815028901734,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.34893617021276596,\n \"acc_stderr\": 0.031158522131357783,\n\
\ \"acc_norm\": 0.34893617021276596,\n \"acc_norm_stderr\": 0.031158522131357783\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.043391383225798615,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.043391383225798615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3586206896551724,\n \"acc_stderr\": 0.03996629574876719,\n\
\ \"acc_norm\": 0.3586206896551724,\n \"acc_norm_stderr\": 0.03996629574876719\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2857142857142857,\n \"acc_stderr\": 0.02326651221373056,\n \"\
acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.02326651221373056\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235172,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.535483870967742,\n \"acc_stderr\": 0.028372287797962935,\n \"\
acc_norm\": 0.535483870967742,\n \"acc_norm_stderr\": 0.028372287797962935\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.35467980295566504,\n \"acc_stderr\": 0.0336612448905145,\n \"\
acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.0336612448905145\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.03895658065271846,\n\
\ \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.03895658065271846\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5454545454545454,\n \"acc_stderr\": 0.03547601494006937,\n \"\
acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.03547601494006937\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6113989637305699,\n \"acc_stderr\": 0.03517739796373132,\n\
\ \"acc_norm\": 0.6113989637305699,\n \"acc_norm_stderr\": 0.03517739796373132\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.44871794871794873,\n \"acc_stderr\": 0.02521731518484648,\n\
\ \"acc_norm\": 0.44871794871794873,\n \"acc_norm_stderr\": 0.02521731518484648\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3907563025210084,\n \"acc_stderr\": 0.031693802357129965,\n\
\ \"acc_norm\": 0.3907563025210084,\n \"acc_norm_stderr\": 0.031693802357129965\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6495412844036698,\n \"acc_stderr\": 0.02045607759982446,\n \"\
acc_norm\": 0.6495412844036698,\n \"acc_norm_stderr\": 0.02045607759982446\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.32407407407407407,\n \"acc_stderr\": 0.03191923445686186,\n \"\
acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.03191923445686186\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5490196078431373,\n \"acc_stderr\": 0.03492406104163613,\n \"\
acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.03492406104163613\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5780590717299579,\n \"acc_stderr\": 0.03214814630240369,\n \
\ \"acc_norm\": 0.5780590717299579,\n \"acc_norm_stderr\": 0.03214814630240369\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5022421524663677,\n\
\ \"acc_stderr\": 0.03355746535223264,\n \"acc_norm\": 0.5022421524663677,\n\
\ \"acc_norm_stderr\": 0.03355746535223264\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5190839694656488,\n \"acc_stderr\": 0.04382094705550988,\n\
\ \"acc_norm\": 0.5190839694656488,\n \"acc_norm_stderr\": 0.04382094705550988\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068383,\n \"\
acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068383\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5740740740740741,\n\
\ \"acc_stderr\": 0.0478034362693679,\n \"acc_norm\": 0.5740740740740741,\n\
\ \"acc_norm_stderr\": 0.0478034362693679\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4662576687116564,\n \"acc_stderr\": 0.03919415545048411,\n\
\ \"acc_norm\": 0.4662576687116564,\n \"acc_norm_stderr\": 0.03919415545048411\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467763,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467763\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n\
\ \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6794871794871795,\n\
\ \"acc_stderr\": 0.03057281131029961,\n \"acc_norm\": 0.6794871794871795,\n\
\ \"acc_norm_stderr\": 0.03057281131029961\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6564495530012772,\n\
\ \"acc_stderr\": 0.016982145632652466,\n \"acc_norm\": 0.6564495530012772,\n\
\ \"acc_norm_stderr\": 0.016982145632652466\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.49710982658959535,\n \"acc_stderr\": 0.02691864538323901,\n\
\ \"acc_norm\": 0.49710982658959535,\n \"acc_norm_stderr\": 0.02691864538323901\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n\
\ \"acc_stderr\": 0.014400296429225632,\n \"acc_norm\": 0.24581005586592178,\n\
\ \"acc_norm_stderr\": 0.014400296429225632\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5326797385620915,\n \"acc_stderr\": 0.02856869975222587,\n\
\ \"acc_norm\": 0.5326797385620915,\n \"acc_norm_stderr\": 0.02856869975222587\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4662379421221865,\n\
\ \"acc_stderr\": 0.028333277109562797,\n \"acc_norm\": 0.4662379421221865,\n\
\ \"acc_norm_stderr\": 0.028333277109562797\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4876543209876543,\n \"acc_stderr\": 0.027812262269327242,\n\
\ \"acc_norm\": 0.4876543209876543,\n \"acc_norm_stderr\": 0.027812262269327242\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36879432624113473,\n \"acc_stderr\": 0.02878222756134724,\n \
\ \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.02878222756134724\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3578878748370274,\n\
\ \"acc_stderr\": 0.012243563850490314,\n \"acc_norm\": 0.3578878748370274,\n\
\ \"acc_norm_stderr\": 0.012243563850490314\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4632352941176471,\n \"acc_stderr\": 0.030290619180485694,\n\
\ \"acc_norm\": 0.4632352941176471,\n \"acc_norm_stderr\": 0.030290619180485694\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.45098039215686275,\n \"acc_stderr\": 0.020130388312904528,\n \
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.020130388312904528\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5636363636363636,\n\
\ \"acc_stderr\": 0.04750185058907296,\n \"acc_norm\": 0.5636363636363636,\n\
\ \"acc_norm_stderr\": 0.04750185058907296\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5428571428571428,\n \"acc_stderr\": 0.031891418324213966,\n\
\ \"acc_norm\": 0.5428571428571428,\n \"acc_norm_stderr\": 0.031891418324213966\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6318407960199005,\n\
\ \"acc_stderr\": 0.03410410565495301,\n \"acc_norm\": 0.6318407960199005,\n\
\ \"acc_norm_stderr\": 0.03410410565495301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3674698795180723,\n\
\ \"acc_stderr\": 0.03753267402120574,\n \"acc_norm\": 0.3674698795180723,\n\
\ \"acc_norm_stderr\": 0.03753267402120574\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6549707602339181,\n \"acc_stderr\": 0.03645981377388806,\n\
\ \"acc_norm\": 0.6549707602339181,\n \"acc_norm_stderr\": 0.03645981377388806\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3537331701346389,\n\
\ \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5149590028669258,\n\
\ \"mc2_stderr\": 0.01603000361270752\n }\n}\n```"
repo_url: https://huggingface.co/togethercomputer/Pythia-Chat-Base-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|arc:challenge|25_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hellaswag|10_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T12:54:22.349435.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T12:54:22.349435.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T12:54:22.349435.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T12:54:22.349435.parquet'
- config_name: results
data_files:
- split: 2023_07_24T12_54_22.349435
path:
- results_2023-07-24T12:54:22.349435.parquet
- split: latest
path:
- results_2023-07-24T12:54:22.349435.parquet
---
# Dataset Card for Evaluation run of frank098/WizardLM_13B_juniper
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/frank098/WizardLM_13B_juniper
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [frank098/WizardLM_13B_juniper](https://huggingface.co/frank098/WizardLM_13B_juniper) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_frank098__WizardLM_13B_juniper",
"harness_truthfulqa_mc_0",
split="train")
```
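The split names used throughout this card's configs follow directly from the run timestamps: hyphens and colons are replaced with underscores so the timestamp becomes a valid split identifier. A minimal sketch of that observed mapping (the helper name is ours, not part of the leaderboard tooling):

```python
def timestamp_to_split_name(ts: str) -> str:
    """Convert a run timestamp (e.g. '2023-07-24T12:54:22.349435')
    into the split-name form used in this card's configs
    (e.g. '2023_07_24T12_54_22.349435')."""
    # The configs above use '_' wherever the timestamp has '-' or ':'.
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split_name("2023-07-24T12:54:22.349435"))
# → 2023_07_24T12_54_22.349435
```

This matches the split names listed in the `configs` section above (e.g. `2023_07_24T12_54_22.349435`); the `latest` split simply reuses the same parquet files as the most recent timestamped split.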
## Latest results
These are the [latest results from run 2023-07-24T12:54:22.349435](https://huggingface.co/datasets/open-llm-leaderboard/details_frank098__WizardLM_13B_juniper/blob/main/results_2023-07-24T12%3A54%3A22.349435.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.45808625384350504,
"acc_stderr": 0.03525693152630926,
"acc_norm": 0.4616346812749258,
"acc_norm_stderr": 0.03524363986169768,
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5149590028669258,
"mc2_stderr": 0.01603000361270752
},
"harness|arc:challenge|25": {
"acc": 0.5366894197952219,
"acc_stderr": 0.014572000527756989,
"acc_norm": 0.5537542662116041,
"acc_norm_stderr": 0.014526705548539982
},
"harness|hellaswag|10": {
"acc": 0.5796654052977495,
"acc_stderr": 0.004926038197714513,
"acc_norm": 0.7719577773351922,
"acc_norm_stderr": 0.004187124964848515
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.042561937679014075,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.042561937679014075
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4276315789473684,
"acc_stderr": 0.04026097083296559,
"acc_norm": 0.4276315789473684,
"acc_norm_stderr": 0.04026097083296559
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.45660377358490567,
"acc_stderr": 0.030656748696739435,
"acc_norm": 0.45660377358490567,
"acc_norm_stderr": 0.030656748696739435
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.04140685639111502,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.04140685639111502
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3583815028901734,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.3583815028901734,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.34893617021276596,
"acc_stderr": 0.031158522131357783,
"acc_norm": 0.34893617021276596,
"acc_norm_stderr": 0.031158522131357783
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.043391383225798615,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.043391383225798615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3586206896551724,
"acc_stderr": 0.03996629574876719,
"acc_norm": 0.3586206896551724,
"acc_norm_stderr": 0.03996629574876719
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.02326651221373056,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.02326651221373056
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235172,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.535483870967742,
"acc_stderr": 0.028372287797962935,
"acc_norm": 0.535483870967742,
"acc_norm_stderr": 0.028372287797962935
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.0336612448905145,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.0336612448905145
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.03895658065271846,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.03895658065271846
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.03547601494006937,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.03547601494006937
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6113989637305699,
"acc_stderr": 0.03517739796373132,
"acc_norm": 0.6113989637305699,
"acc_norm_stderr": 0.03517739796373132
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.44871794871794873,
"acc_stderr": 0.02521731518484648,
"acc_norm": 0.44871794871794873,
"acc_norm_stderr": 0.02521731518484648
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3907563025210084,
"acc_stderr": 0.031693802357129965,
"acc_norm": 0.3907563025210084,
"acc_norm_stderr": 0.031693802357129965
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6495412844036698,
"acc_stderr": 0.02045607759982446,
"acc_norm": 0.6495412844036698,
"acc_norm_stderr": 0.02045607759982446
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.03191923445686186,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.03191923445686186
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.03492406104163613,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.03492406104163613
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5780590717299579,
"acc_stderr": 0.03214814630240369,
"acc_norm": 0.5780590717299579,
"acc_norm_stderr": 0.03214814630240369
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5022421524663677,
"acc_stderr": 0.03355746535223264,
"acc_norm": 0.5022421524663677,
"acc_norm_stderr": 0.03355746535223264
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5190839694656488,
"acc_stderr": 0.04382094705550988,
"acc_norm": 0.5190839694656488,
"acc_norm_stderr": 0.04382094705550988
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.04449270350068383,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.04449270350068383
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.0478034362693679,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.0478034362693679
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4662576687116564,
"acc_stderr": 0.03919415545048411,
"acc_norm": 0.4662576687116564,
"acc_norm_stderr": 0.03919415545048411
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467763,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467763
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.03057281131029961,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.03057281131029961
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6564495530012772,
"acc_stderr": 0.016982145632652466,
"acc_norm": 0.6564495530012772,
"acc_norm_stderr": 0.016982145632652466
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.49710982658959535,
"acc_stderr": 0.02691864538323901,
"acc_norm": 0.49710982658959535,
"acc_norm_stderr": 0.02691864538323901
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24581005586592178,
"acc_stderr": 0.014400296429225632,
"acc_norm": 0.24581005586592178,
"acc_norm_stderr": 0.014400296429225632
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5326797385620915,
"acc_stderr": 0.02856869975222587,
"acc_norm": 0.5326797385620915,
"acc_norm_stderr": 0.02856869975222587
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4662379421221865,
"acc_stderr": 0.028333277109562797,
"acc_norm": 0.4662379421221865,
"acc_norm_stderr": 0.028333277109562797
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4876543209876543,
"acc_stderr": 0.027812262269327242,
"acc_norm": 0.4876543209876543,
"acc_norm_stderr": 0.027812262269327242
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36879432624113473,
"acc_stderr": 0.02878222756134724,
"acc_norm": 0.36879432624113473,
"acc_norm_stderr": 0.02878222756134724
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3578878748370274,
"acc_stderr": 0.012243563850490314,
"acc_norm": 0.3578878748370274,
"acc_norm_stderr": 0.012243563850490314
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4632352941176471,
"acc_stderr": 0.030290619180485694,
"acc_norm": 0.4632352941176471,
"acc_norm_stderr": 0.030290619180485694
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.020130388312904528,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.020130388312904528
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.04750185058907296,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.04750185058907296
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5428571428571428,
"acc_stderr": 0.031891418324213966,
"acc_norm": 0.5428571428571428,
"acc_norm_stderr": 0.031891418324213966
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6318407960199005,
"acc_stderr": 0.03410410565495301,
"acc_norm": 0.6318407960199005,
"acc_norm_stderr": 0.03410410565495301
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3674698795180723,
"acc_stderr": 0.03753267402120574,
"acc_norm": 0.3674698795180723,
"acc_norm_stderr": 0.03753267402120574
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6549707602339181,
"acc_stderr": 0.03645981377388806,
"acc_norm": 0.6549707602339181,
"acc_norm_stderr": 0.03645981377388806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5149590028669258,
"mc2_stderr": 0.01603000361270752
}
}
```
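The top-level `"all"` entry summarizes the per-task metrics; its `acc` is consistent with an unweighted mean of the individual task `acc` values. A rough sketch of that aggregation, under the assumption of a simple unweighted mean (hypothetical helper, not the leaderboard's own code):

```python
def mean_acc(results: dict) -> float:
    """Unweighted mean of 'acc' over all per-task entries,
    skipping the aggregate 'all' entry and tasks that do not
    report 'acc' (e.g. truthfulqa, which reports mc1/mc2)."""
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task != "all" and "acc" in metrics
    ]
    return sum(accs) / len(accs)

# Tiny illustrative input in the same shape as the JSON above.
sample = {
    "all": {"acc": 0.5},
    "harness|arc:challenge|25": {"acc": 0.4},
    "harness|hellaswag|10": {"acc": 0.6},
    "harness|truthfulqa:mc|0": {"mc2": 0.51},
}
print(mean_acc(sample))  # → 0.5
```

Running this over the full results dict loaded from the JSON linked above should land close to the reported `"all"` `acc` of 0.458.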
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-elementary_mathematics-neg-prepend-fix | 2023-08-21T07:34:16.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 6267
num_examples: 5
- name: test
num_bytes: 854641
num_examples: 378
download_size: 14034
dataset_size: 860908
---
# Dataset Card for "mmlu-elementary_mathematics-neg-prepend-fix"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_frank098__orca_mini_3b_juniper | 2023-09-17T00:19:56.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of frank098/orca_mini_3b_juniper
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [frank098/orca_mini_3b_juniper](https://huggingface.co/frank098/orca_mini_3b_juniper)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_frank098__orca_mini_3b_juniper\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T00:19:44.475095](https://huggingface.co/datasets/open-llm-leaderboard/details_frank098__orca_mini_3b_juniper/blob/main/results_2023-09-17T00-19-44.475095.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0007340604026845638,\n\
\ \"em_stderr\": 0.000277361445733574,\n \"f1\": 0.04966652684563771,\n\
\ \"f1_stderr\": 0.001261898789421576,\n \"acc\": 0.3041531307650375,\n\
\ \"acc_stderr\": 0.007876199120377373\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.000277361445733574,\n\
\ \"f1\": 0.04966652684563771,\n \"f1_stderr\": 0.001261898789421576\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.00530705079605762,\n \
\ \"acc_stderr\": 0.002001305720948044\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6029992107340174,\n \"acc_stderr\": 0.013751092519806702\n\
\ }\n}\n```"
repo_url: https://huggingface.co/frank098/orca_mini_3b_juniper
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|arc:challenge|25_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T00_19_44.475095
path:
- '**/details_harness|drop|3_2023-09-17T00-19-44.475095.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T00-19-44.475095.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T00_19_44.475095
path:
- '**/details_harness|gsm8k|5_2023-09-17T00-19-44.475095.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T00-19-44.475095.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hellaswag|10_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T10:27:47.193085.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T10:27:47.193085.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T10:27:47.193085.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T00_19_44.475095
path:
- '**/details_harness|winogrande|5_2023-09-17T00-19-44.475095.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T00-19-44.475095.parquet'
- config_name: results
data_files:
- split: 2023_07_24T10_27_47.193085
path:
- results_2023-07-24T10:27:47.193085.parquet
- split: 2023_09_17T00_19_44.475095
path:
- results_2023-09-17T00-19-44.475095.parquet
- split: latest
path:
- results_2023-09-17T00-19-44.475095.parquet
---
# Dataset Card for Evaluation run of frank098/orca_mini_3b_juniper
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/frank098/orca_mini_3b_juniper
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [frank098/orca_mini_3b_juniper](https://huggingface.co/frank098/orca_mini_3b_juniper) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_frank098__orca_mini_3b_juniper",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T00:19:44.475095](https://huggingface.co/datasets/open-llm-leaderboard/details_frank098__orca_mini_3b_juniper/blob/main/results_2023-09-17T00-19-44.475095.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0007340604026845638,
"em_stderr": 0.000277361445733574,
"f1": 0.04966652684563771,
"f1_stderr": 0.001261898789421576,
"acc": 0.3041531307650375,
"acc_stderr": 0.007876199120377373
},
"harness|drop|3": {
"em": 0.0007340604026845638,
"em_stderr": 0.000277361445733574,
"f1": 0.04966652684563771,
"f1_stderr": 0.001261898789421576
},
"harness|gsm8k|5": {
"acc": 0.00530705079605762,
"acc_stderr": 0.002001305720948044
},
"harness|winogrande|5": {
"acc": 0.6029992107340174,
"acc_stderr": 0.013751092519806702
}
}
```
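Each `harness|<task>|<num_fewshot>` key in the dict above holds that task's metrics, with `"all"` carrying the aggregates. A minimal sketch (plain Python, reusing the values printed above) of flattening the nested dict into per-task metric rows:

```python
# Results dict as printed above (abridged to the per-metric values).
results = {
    "all": {"em": 0.0007340604026845638, "f1": 0.04966652684563771,
            "acc": 0.3041531307650375},
    "harness|drop|3": {"em": 0.0007340604026845638, "f1": 0.04966652684563771},
    "harness|gsm8k|5": {"acc": 0.00530705079605762},
    "harness|winogrande|5": {"acc": 0.6029992107340174},
}

rows = []
for key, metrics in results.items():
    if key == "all":  # skip the aggregate entry
        continue
    _, task, num_fewshot = key.split("|")  # e.g. "harness|gsm8k|5"
    for metric, value in metrics.items():
        rows.append({"task": task, "num_fewshot": int(num_fewshot),
                     "metric": metric, "value": value})

for r in rows:
    print(f"{r['task']:<12} {r['num_fewshot']}-shot  {r['metric']:<4} {r['value']:.4f}")
```

The same key-splitting convention applies to the config names in this repo (e.g. `harness_winogrande_5`), so the flattened rows can be joined back to the detail configs by task name and few-shot count.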
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_FabbriSimo01__Facebook_opt_1.3b_Quantized | 2023-09-18T06:09:34.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of FabbriSimo01/Facebook_opt_1.3b_Quantized
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [FabbriSimo01/Facebook_opt_1.3b_Quantized](https://huggingface.co/FabbriSimo01/Facebook_opt_1.3b_Quantized)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FabbriSimo01__Facebook_opt_1.3b_Quantized\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-18T06:09:22.891569](https://huggingface.co/datasets/open-llm-leaderboard/details_FabbriSimo01__Facebook_opt_1.3b_Quantized/blob/main/results_2023-09-18T06-09-22.891569.json)\
\ (note that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0020973154362416107,\n\
\ \"em_stderr\": 0.00046850650303682405,\n \"f1\": 0.05110318791946325,\n\
\ \"f1_stderr\": 0.0012507542097710141,\n \"acc\": 0.29910069155018665,\n\
\ \"acc_stderr\": 0.007429518317222754\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0020973154362416107,\n \"em_stderr\": 0.00046850650303682405,\n\
\ \"f1\": 0.05110318791946325,\n \"f1_stderr\": 0.0012507542097710141\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.001516300227445034,\n \
\ \"acc_stderr\": 0.0010717793485492619\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5966850828729282,\n \"acc_stderr\": 0.013787257285896245\n\
\ }\n}\n```"
repo_url: https://huggingface.co/FabbriSimo01/Facebook_opt_1.3b_Quantized
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_18T06_09_22.891569
path:
- '**/details_harness|drop|3_2023-09-18T06-09-22.891569.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-18T06-09-22.891569.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_18T06_09_22.891569
path:
- '**/details_harness|gsm8k|5_2023-09-18T06-09-22.891569.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-18T06-09-22.891569.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:58:20.478747.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:58:20.478747.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:58:20.478747.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_18T06_09_22.891569
path:
- '**/details_harness|winogrande|5_2023-09-18T06-09-22.891569.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-18T06-09-22.891569.parquet'
- config_name: results
data_files:
- split: 2023_07_19T14_58_20.478747
path:
- results_2023-07-19T14:58:20.478747.parquet
- split: 2023_09_18T06_09_22.891569
path:
- results_2023-09-18T06-09-22.891569.parquet
- split: latest
path:
- results_2023-09-18T06-09-22.891569.parquet
---
# Dataset Card for Evaluation run of FabbriSimo01/Facebook_opt_1.3b_Quantized
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/FabbriSimo01/Facebook_opt_1.3b_Quantized
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [FabbriSimo01/Facebook_opt_1.3b_Quantized](https://huggingface.co/FabbriSimo01/Facebook_opt_1.3b_Quantized) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FabbriSimo01__Facebook_opt_1.3b_Quantized",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-18T06:09:22.891569](https://huggingface.co/datasets/open-llm-leaderboard/details_FabbriSimo01__Facebook_opt_1.3b_Quantized/blob/main/results_2023-09-18T06-09-22.891569.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0020973154362416107,
"em_stderr": 0.00046850650303682405,
"f1": 0.05110318791946325,
"f1_stderr": 0.0012507542097710141,
"acc": 0.29910069155018665,
"acc_stderr": 0.007429518317222754
},
"harness|drop|3": {
"em": 0.0020973154362416107,
"em_stderr": 0.00046850650303682405,
"f1": 0.05110318791946325,
"f1_stderr": 0.0012507542097710141
},
"harness|gsm8k|5": {
"acc": 0.001516300227445034,
"acc_stderr": 0.0010717793485492619
},
"harness|winogrande|5": {
"acc": 0.5966850828729282,
"acc_stderr": 0.013787257285896245
}
}
```
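For quick inspection, the nested results shown above can be flattened into per-task metric rows with plain Python. This is just a sketch: the `results` literal below copies the values from the JSON in this card, and stderr entries are dropped for brevity.

```python
# Flatten the nested results dict (copied from the JSON above) into
# (task, metric, value) rows, skipping the *_stderr entries.
results = {
    "all": {"em": 0.0020973154362416107, "f1": 0.05110318791946325,
            "acc": 0.29910069155018665},
    "harness|drop|3": {"em": 0.0020973154362416107, "f1": 0.05110318791946325},
    "harness|gsm8k|5": {"acc": 0.001516300227445034},
    "harness|winogrande|5": {"acc": 0.5966850828729282},
}

rows = [
    (task, metric, value)
    for task, metrics in results.items()
    for metric, value in metrics.items()
    if not metric.endswith("_stderr")
]

for task, metric, value in rows:
    print(f"{task:25s} {metric:4s} {value:.4f}")
```

The same loop works on the full JSON file from the "results" config if you keep the stderr filter in place.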
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-formal_logic-neg-prepend-fix | 2023-08-21T07:34:27.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 7320
num_examples: 5
- name: test
num_bytes: 426415
num_examples: 126
download_size: 16460
dataset_size: 433735
---
# Dataset Card for "mmlu-formal_logic-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_arver__llama7b-qlora | 2023-09-17T10:28:26.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of arver/llama7b-qlora
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [arver/llama7b-qlora](https://huggingface.co/arver/llama7b-qlora) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_arver__llama7b-qlora\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T10:28:15.239885](https://huggingface.co/datasets/open-llm-leaderboard/details_arver__llama7b-qlora/blob/main/results_2023-09-17T10-28-15.239885.json)\
\ (note that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n\
\ \"em_stderr\": 0.0003921042190298541,\n \"f1\": 0.06146078020134238,\n\
\ \"f1_stderr\": 0.0013862861484435665,\n \"acc\": 0.37858887140948305,\n\
\ \"acc_stderr\": 0.008690432281689055\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.0003921042190298541,\n\
\ \"f1\": 0.06146078020134238,\n \"f1_stderr\": 0.0013862861484435665\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03184230477634572,\n \
\ \"acc_stderr\": 0.004836348558260928\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7253354380426204,\n \"acc_stderr\": 0.012544516005117185\n\
\ }\n}\n```"
repo_url: https://huggingface.co/arver/llama7b-qlora
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|arc:challenge|25_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T10_28_15.239885
path:
- '**/details_harness|drop|3_2023-09-17T10-28-15.239885.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T10-28-15.239885.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T10_28_15.239885
path:
- '**/details_harness|gsm8k|5_2023-09-17T10-28-15.239885.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T10-28-15.239885.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hellaswag|10_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T19:44:33.087537.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T19:44:33.087537.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T19:44:33.087537.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T10_28_15.239885
path:
- '**/details_harness|winogrande|5_2023-09-17T10-28-15.239885.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T10-28-15.239885.parquet'
- config_name: results
data_files:
- split: 2023_08_09T19_44_33.087537
path:
- results_2023-08-09T19:44:33.087537.parquet
- split: 2023_09_17T10_28_15.239885
path:
- results_2023-09-17T10-28-15.239885.parquet
- split: latest
path:
- results_2023-09-17T10-28-15.239885.parquet
---
# Dataset Card for Evaluation run of arver/llama7b-qlora
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/arver/llama7b-qlora
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [arver/llama7b-qlora](https://huggingface.co/arver/llama7b-qlora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_arver__llama7b-qlora",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T10:28:15.239885](https://huggingface.co/datasets/open-llm-leaderboard/details_arver__llama7b-qlora/blob/main/results_2023-09-17T10-28-15.239885.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298541,
"f1": 0.06146078020134238,
"f1_stderr": 0.0013862861484435665,
"acc": 0.37858887140948305,
"acc_stderr": 0.008690432281689055
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298541,
"f1": 0.06146078020134238,
"f1_stderr": 0.0013862861484435665
},
"harness|gsm8k|5": {
"acc": 0.03184230477634572,
"acc_stderr": 0.004836348558260928
},
"harness|winogrande|5": {
"acc": 0.7253354380426204,
"acc_stderr": 0.012544516005117185
}
}
```
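The top-level "all" accuracy appears to be the unweighted mean of the per-task accuracies. A quick consistency check, with the values copied from the JSON above:

```python
# Values copied from the results JSON above.
gsm8k_acc = 0.03184230477634572
winogrande_acc = 0.7253354380426204
all_acc = 0.37858887140948305

# The aggregate "all" accuracy matches the unweighted mean of the
# two per-task accuracies (up to floating-point rounding).
mean_acc = (gsm8k_acc + winogrande_acc) / 2
assert abs(mean_acc - all_acc) < 1e-12
print(mean_acc)
```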
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-global_facts-neg-prepend-fix | 2023-08-21T07:34:40.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 5659
num_examples: 5
- name: test
num_bytes: 225313
num_examples: 100
download_size: 13046
dataset_size: 230972
---
# Dataset Card for "mmlu-global_facts-neg-prepend-fix"
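Per the `dataset_info` schema above, the `answer` feature is stored as a class-label index whose names are `A`–`D`. A minimal decoding sketch (the helper name is illustrative, not part of the dataset):

```python
# Class-label names for the `answer` feature, as listed in the
# dataset_info schema above.
ANSWER_NAMES = ["A", "B", "C", "D"]

def decode_answer(label: int) -> str:
    """Map a stored class-label index back to its answer letter."""
    return ANSWER_NAMES[label]

print(decode_answer(2))  # C
```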
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_beaugogh__pythia-1.4b-deduped-sharegpt | 2023-08-27T12:39:20.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of beaugogh/pythia-1.4b-deduped-sharegpt
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [beaugogh/pythia-1.4b-deduped-sharegpt](https://huggingface.co/beaugogh/pythia-1.4b-deduped-sharegpt)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_beaugogh__pythia-1.4b-deduped-sharegpt\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-31T09:37:34.765508](https://huggingface.co/datasets/open-llm-leaderboard/details_beaugogh__pythia-1.4b-deduped-sharegpt/blob/main/results_2023-07-31T09%3A37%3A34.765508.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24390853423323458,\n\
\ \"acc_stderr\": 0.030959502210233317,\n \"acc_norm\": 0.24687570133255124,\n\
\ \"acc_norm_stderr\": 0.03096870063976746,\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.015250117079156494,\n \"mc2\": 0.4180996357014574,\n\
\ \"mc2_stderr\": 0.01449797252378389\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.29948805460750855,\n \"acc_stderr\": 0.013385021637313565,\n\
\ \"acc_norm\": 0.3430034129692833,\n \"acc_norm_stderr\": 0.013872423223718166\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4133638717386975,\n\
\ \"acc_stderr\": 0.0049143057985756985,\n \"acc_norm\": 0.5449113722366062,\n\
\ \"acc_norm_stderr\": 0.004969611554685393\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313141,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313141\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.14473684210526316,\n \"acc_stderr\": 0.0286319518459304,\n\
\ \"acc_norm\": 0.14473684210526316,\n \"acc_norm_stderr\": 0.0286319518459304\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.31,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2830188679245283,\n \"acc_stderr\": 0.027724236492700904,\n\
\ \"acc_norm\": 0.2830188679245283,\n \"acc_norm_stderr\": 0.027724236492700904\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23121387283236994,\n\
\ \"acc_stderr\": 0.032147373020294696,\n \"acc_norm\": 0.23121387283236994,\n\
\ \"acc_norm_stderr\": 0.032147373020294696\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2680851063829787,\n \"acc_stderr\": 0.028957342788342347,\n\
\ \"acc_norm\": 0.2680851063829787,\n \"acc_norm_stderr\": 0.028957342788342347\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.20175438596491227,\n\
\ \"acc_stderr\": 0.037752050135836386,\n \"acc_norm\": 0.20175438596491227,\n\
\ \"acc_norm_stderr\": 0.037752050135836386\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924812,\n\
\ \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924812\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.22486772486772486,\n \"acc_stderr\": 0.02150209607822914,\n \"\
acc_norm\": 0.22486772486772486,\n \"acc_norm_stderr\": 0.02150209607822914\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604674,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604674\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036624,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036624\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23225806451612904,\n\
\ \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.23225806451612904,\n\
\ \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.16748768472906403,\n \"acc_stderr\": 0.02627308604753542,\n\
\ \"acc_norm\": 0.16748768472906403,\n \"acc_norm_stderr\": 0.02627308604753542\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\"\
: 0.18,\n \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.031234752377721175,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.031234752377721175\n \
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.1717171717171717,\n \"acc_stderr\": 0.026869716187429914,\n \"\
acc_norm\": 0.1717171717171717,\n \"acc_norm_stderr\": 0.026869716187429914\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22279792746113988,\n \"acc_stderr\": 0.03003114797764154,\n\
\ \"acc_norm\": 0.22279792746113988,\n \"acc_norm_stderr\": 0.03003114797764154\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.24102564102564103,\n \"acc_stderr\": 0.021685546665333205,\n\
\ \"acc_norm\": 0.24102564102564103,\n \"acc_norm_stderr\": 0.021685546665333205\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871927,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871927\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23949579831932774,\n \"acc_stderr\": 0.027722065493361255,\n\
\ \"acc_norm\": 0.23949579831932774,\n \"acc_norm_stderr\": 0.027722065493361255\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.035433042343899844,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.035433042343899844\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21284403669724772,\n \"acc_stderr\": 0.01754937638931369,\n \"\
acc_norm\": 0.21284403669724772,\n \"acc_norm_stderr\": 0.01754937638931369\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896078,\n \"\
acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896078\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591362,\n \"\
acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2489451476793249,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.2489451476793249,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.27802690582959644,\n\
\ \"acc_stderr\": 0.030069584874494033,\n \"acc_norm\": 0.27802690582959644,\n\
\ \"acc_norm_stderr\": 0.030069584874494033\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n \
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2147239263803681,\n \"acc_stderr\": 0.03226219377286774,\n\
\ \"acc_norm\": 0.2147239263803681,\n \"acc_norm_stderr\": 0.03226219377286774\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24358974358974358,\n\
\ \"acc_stderr\": 0.028120966503914397,\n \"acc_norm\": 0.24358974358974358,\n\
\ \"acc_norm_stderr\": 0.028120966503914397\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2413793103448276,\n\
\ \"acc_stderr\": 0.015302380123542085,\n \"acc_norm\": 0.2413793103448276,\n\
\ \"acc_norm_stderr\": 0.015302380123542085\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.27167630057803466,\n \"acc_stderr\": 0.023948512905468334,\n\
\ \"acc_norm\": 0.27167630057803466,\n \"acc_norm_stderr\": 0.023948512905468334\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.024630048979824768,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.024630048979824768\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2090032154340836,\n\
\ \"acc_stderr\": 0.02309314039837422,\n \"acc_norm\": 0.2090032154340836,\n\
\ \"acc_norm_stderr\": 0.02309314039837422\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.024383665531035457,\n\
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.024383665531035457\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.20921985815602837,\n \"acc_stderr\": 0.024264769439988496,\n \
\ \"acc_norm\": 0.20921985815602837,\n \"acc_norm_stderr\": 0.024264769439988496\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24511082138200782,\n\
\ \"acc_stderr\": 0.010986307870045517,\n \"acc_norm\": 0.24511082138200782,\n\
\ \"acc_norm_stderr\": 0.010986307870045517\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.1875,\n \"acc_stderr\": 0.023709788253811766,\n \
\ \"acc_norm\": 0.1875,\n \"acc_norm_stderr\": 0.023709788253811766\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2647058823529412,\n \"acc_stderr\": 0.01784808957491322,\n \
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.01784808957491322\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.03895091015724137,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.03895091015724137\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2530612244897959,\n \"acc_stderr\": 0.02783302387139968,\n\
\ \"acc_norm\": 0.2530612244897959,\n \"acc_norm_stderr\": 0.02783302387139968\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.030360490154014652,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.030360490154014652\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n\
\ \"acc_stderr\": 0.03571609230053481,\n \"acc_norm\": 0.30120481927710846,\n\
\ \"acc_norm_stderr\": 0.03571609230053481\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.015250117079156494,\n \"mc2\": 0.4180996357014574,\n\
\ \"mc2_stderr\": 0.01449797252378389\n }\n}\n```"
repo_url: https://huggingface.co/beaugogh/pythia-1.4b-deduped-sharegpt
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|arc:challenge|25_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hellaswag|10_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T09:37:34.765508.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T09:37:34.765508.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-31T09:37:34.765508.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-31T09:37:34.765508.parquet'
- config_name: results
data_files:
- split: 2023_07_31T09_37_34.765508
path:
- results_2023-07-31T09:37:34.765508.parquet
- split: latest
path:
- results_2023-07-31T09:37:34.765508.parquet
---
# Dataset Card for Evaluation run of beaugogh/pythia-1.4b-deduped-sharegpt
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/beaugogh/pythia-1.4b-deduped-sharegpt
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [beaugogh/pythia-1.4b-deduped-sharegpt](https://huggingface.co/beaugogh/pythia-1.4b-deduped-sharegpt) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_beaugogh__pythia-1.4b-deduped-sharegpt",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-07-31T09:37:34.765508](https://huggingface.co/datasets/open-llm-leaderboard/details_beaugogh__pythia-1.4b-deduped-sharegpt/blob/main/results_2023-07-31T09%3A37%3A34.765508.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24390853423323458,
"acc_stderr": 0.030959502210233317,
"acc_norm": 0.24687570133255124,
"acc_norm_stderr": 0.03096870063976746,
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156494,
"mc2": 0.4180996357014574,
"mc2_stderr": 0.01449797252378389
},
"harness|arc:challenge|25": {
"acc": 0.29948805460750855,
"acc_stderr": 0.013385021637313565,
"acc_norm": 0.3430034129692833,
"acc_norm_stderr": 0.013872423223718166
},
"harness|hellaswag|10": {
"acc": 0.4133638717386975,
"acc_stderr": 0.0049143057985756985,
"acc_norm": 0.5449113722366062,
"acc_norm_stderr": 0.004969611554685393
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313141,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313141
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.14473684210526316,
"acc_stderr": 0.0286319518459304,
"acc_norm": 0.14473684210526316,
"acc_norm_stderr": 0.0286319518459304
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2830188679245283,
"acc_stderr": 0.027724236492700904,
"acc_norm": 0.2830188679245283,
"acc_norm_stderr": 0.027724236492700904
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.032147373020294696,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.032147373020294696
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2680851063829787,
"acc_stderr": 0.028957342788342347,
"acc_norm": 0.2680851063829787,
"acc_norm_stderr": 0.028957342788342347
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.20175438596491227,
"acc_stderr": 0.037752050135836386,
"acc_norm": 0.20175438596491227,
"acc_norm_stderr": 0.037752050135836386
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.03455930201924812,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.03455930201924812
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.22486772486772486,
"acc_stderr": 0.02150209607822914,
"acc_norm": 0.22486772486772486,
"acc_norm_stderr": 0.02150209607822914
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604674,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604674
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23225806451612904,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.23225806451612904,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.16748768472906403,
"acc_stderr": 0.02627308604753542,
"acc_norm": 0.16748768472906403,
"acc_norm_stderr": 0.02627308604753542
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2,
"acc_stderr": 0.031234752377721175,
"acc_norm": 0.2,
"acc_norm_stderr": 0.031234752377721175
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.1717171717171717,
"acc_stderr": 0.026869716187429914,
"acc_norm": 0.1717171717171717,
"acc_norm_stderr": 0.026869716187429914
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22279792746113988,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.22279792746113988,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24102564102564103,
"acc_stderr": 0.021685546665333205,
"acc_norm": 0.24102564102564103,
"acc_norm_stderr": 0.021685546665333205
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871927,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871927
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23949579831932774,
"acc_stderr": 0.027722065493361255,
"acc_norm": 0.23949579831932774,
"acc_norm_stderr": 0.027722065493361255
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.035433042343899844,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.035433042343899844
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21284403669724772,
"acc_stderr": 0.01754937638931369,
"acc_norm": 0.21284403669724772,
"acc_norm_stderr": 0.01754937638931369
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4212962962962963,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.4212962962962963,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2489451476793249,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.2489451476793249,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.27802690582959644,
"acc_stderr": 0.030069584874494033,
"acc_norm": 0.27802690582959644,
"acc_norm_stderr": 0.030069584874494033
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2147239263803681,
"acc_stderr": 0.03226219377286774,
"acc_norm": 0.2147239263803681,
"acc_norm_stderr": 0.03226219377286774
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.028120966503914397,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.028120966503914397
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.015302380123542085,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.015302380123542085
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.27167630057803466,
"acc_stderr": 0.023948512905468334,
"acc_norm": 0.27167630057803466,
"acc_norm_stderr": 0.023948512905468334
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.024630048979824768,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.024630048979824768
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2090032154340836,
"acc_stderr": 0.02309314039837422,
"acc_norm": 0.2090032154340836,
"acc_norm_stderr": 0.02309314039837422
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.024383665531035457,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.024383665531035457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.20921985815602837,
"acc_stderr": 0.024264769439988496,
"acc_norm": 0.20921985815602837,
"acc_norm_stderr": 0.024264769439988496
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24511082138200782,
"acc_stderr": 0.010986307870045517,
"acc_norm": 0.24511082138200782,
"acc_norm_stderr": 0.010986307870045517
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.1875,
"acc_stderr": 0.023709788253811766,
"acc_norm": 0.1875,
"acc_norm_stderr": 0.023709788253811766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.01784808957491322,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.01784808957491322
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.03895091015724137,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.03895091015724137
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2530612244897959,
"acc_stderr": 0.02783302387139968,
"acc_norm": 0.2530612244897959,
"acc_norm_stderr": 0.02783302387139968
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014652,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014652
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.30120481927710846,
"acc_stderr": 0.03571609230053481,
"acc_norm": 0.30120481927710846,
"acc_norm_stderr": 0.03571609230053481
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156494,
"mc2": 0.4180996357014574,
"mc2_stderr": 0.01449797252378389
}
}
```
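As a rough illustration (an assumption, not the leaderboard's exact aggregation code), the overall `"all"` block can be thought of as averaging the per-task metrics. A minimal sketch using a few of the values from the results above:

```python
# Illustrative sketch: averaging per-task accuracies into an overall score.
# The task names and values are copied from the results JSON above; the
# leaderboard's actual aggregation may differ (e.g. in weighting or task set).
per_task_acc = {
    "harness|arc:challenge|25": 0.29948805460750855,
    "harness|hellaswag|10": 0.4133638717386975,
    "harness|hendrycksTest-abstract_algebra|5": 0.24,
}

overall_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(f"mean acc over {len(per_task_acc)} tasks: {overall_acc:.4f}")
```

With all 61 tasks included (rather than the three shown here), this kind of mean yields the `"acc"` value reported in the `"all"` block.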
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_beaugogh__Llama2-7b-sharegpt4 | 2023-08-27T12:39:21.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of beaugogh/Llama2-7b-sharegpt4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [beaugogh/Llama2-7b-sharegpt4](https://huggingface.co/beaugogh/Llama2-7b-sharegpt4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_beaugogh__Llama2-7b-sharegpt4\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-09T11:50:59.260675](https://huggingface.co/datasets/open-llm-leaderboard/details_beaugogh__Llama2-7b-sharegpt4/blob/main/results_2023-08-09T11%3A50%3A59.260675.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.47540015600114766,\n\
\ \"acc_stderr\": 0.0352765312816594,\n \"acc_norm\": 0.4792802441971616,\n\
\ \"acc_norm_stderr\": 0.03525935627017046,\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.0162380650690596,\n \"mc2\": 0.4574342728041311,\n\
\ \"mc2_stderr\": 0.01547770551899752\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.523037542662116,\n \"acc_stderr\": 0.01459587320535827,\n\
\ \"acc_norm\": 0.5588737201365188,\n \"acc_norm_stderr\": 0.014509747749064664\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6152160924118701,\n\
\ \"acc_stderr\": 0.004855498343308389,\n \"acc_norm\": 0.8083051185022904,\n\
\ \"acc_norm_stderr\": 0.003928298121755033\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.46710526315789475,\n \"acc_stderr\": 0.040601270352363966,\n\
\ \"acc_norm\": 0.46710526315789475,\n \"acc_norm_stderr\": 0.040601270352363966\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5245283018867924,\n \"acc_stderr\": 0.030735822206205608,\n\
\ \"acc_norm\": 0.5245283018867924,\n \"acc_norm_stderr\": 0.030735822206205608\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4652777777777778,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.4652777777777778,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4161849710982659,\n\
\ \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.4161849710982659,\n\
\ \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.023517294335963286,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.023517294335963286\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.04190596438871136,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.04190596438871136\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5258064516129032,\n\
\ \"acc_stderr\": 0.028406095057653326,\n \"acc_norm\": 0.5258064516129032,\n\
\ \"acc_norm_stderr\": 0.028406095057653326\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3645320197044335,\n \"acc_stderr\": 0.033864057460620905,\n\
\ \"acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.033864057460620905\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\
: 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6121212121212121,\n \"acc_stderr\": 0.038049136539710114,\n\
\ \"acc_norm\": 0.6121212121212121,\n \"acc_norm_stderr\": 0.038049136539710114\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5252525252525253,\n \"acc_stderr\": 0.03557806245087314,\n \"\
acc_norm\": 0.5252525252525253,\n \"acc_norm_stderr\": 0.03557806245087314\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6476683937823834,\n \"acc_stderr\": 0.03447478286414357,\n\
\ \"acc_norm\": 0.6476683937823834,\n \"acc_norm_stderr\": 0.03447478286414357\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.44358974358974357,\n \"acc_stderr\": 0.025189149894764205,\n\
\ \"acc_norm\": 0.44358974358974357,\n \"acc_norm_stderr\": 0.025189149894764205\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945284,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945284\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.46638655462184875,\n \"acc_stderr\": 0.03240501447690071,\n\
\ \"acc_norm\": 0.46638655462184875,\n \"acc_norm_stderr\": 0.03240501447690071\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23178807947019867,\n \"acc_stderr\": 0.03445406271987053,\n \"\
acc_norm\": 0.23178807947019867,\n \"acc_norm_stderr\": 0.03445406271987053\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6403669724770642,\n \"acc_stderr\": 0.020575234660123776,\n \"\
acc_norm\": 0.6403669724770642,\n \"acc_norm_stderr\": 0.020575234660123776\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.28703703703703703,\n \"acc_stderr\": 0.030851992993257013,\n \"\
acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.030851992993257013\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5931372549019608,\n \"acc_stderr\": 0.034478911363533815,\n \"\
acc_norm\": 0.5931372549019608,\n \"acc_norm_stderr\": 0.034478911363533815\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6497890295358649,\n \"acc_stderr\": 0.031052391937584346,\n \
\ \"acc_norm\": 0.6497890295358649,\n \"acc_norm_stderr\": 0.031052391937584346\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5381165919282511,\n\
\ \"acc_stderr\": 0.033460150119732274,\n \"acc_norm\": 0.5381165919282511,\n\
\ \"acc_norm_stderr\": 0.033460150119732274\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968432,\n \"\
acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968432\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760628,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.49079754601226994,\n \"acc_stderr\": 0.03927705600787443,\n\
\ \"acc_norm\": 0.49079754601226994,\n \"acc_norm_stderr\": 0.03927705600787443\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5728155339805825,\n \"acc_stderr\": 0.04897957737781168,\n\
\ \"acc_norm\": 0.5728155339805825,\n \"acc_norm_stderr\": 0.04897957737781168\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.688034188034188,\n\
\ \"acc_stderr\": 0.03035152732334493,\n \"acc_norm\": 0.688034188034188,\n\
\ \"acc_norm_stderr\": 0.03035152732334493\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6475095785440613,\n\
\ \"acc_stderr\": 0.01708415024408138,\n \"acc_norm\": 0.6475095785440613,\n\
\ \"acc_norm_stderr\": 0.01708415024408138\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5144508670520231,\n \"acc_stderr\": 0.026907849856282542,\n\
\ \"acc_norm\": 0.5144508670520231,\n \"acc_norm_stderr\": 0.026907849856282542\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33854748603351953,\n\
\ \"acc_stderr\": 0.01582670009648135,\n \"acc_norm\": 0.33854748603351953,\n\
\ \"acc_norm_stderr\": 0.01582670009648135\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5032679738562091,\n \"acc_stderr\": 0.028629305194003543,\n\
\ \"acc_norm\": 0.5032679738562091,\n \"acc_norm_stderr\": 0.028629305194003543\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5466237942122186,\n\
\ \"acc_stderr\": 0.02827435985489424,\n \"acc_norm\": 0.5466237942122186,\n\
\ \"acc_norm_stderr\": 0.02827435985489424\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.558641975308642,\n \"acc_stderr\": 0.027628737155668763,\n\
\ \"acc_norm\": 0.558641975308642,\n \"acc_norm_stderr\": 0.027628737155668763\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3723404255319149,\n \"acc_stderr\": 0.028838921471251458,\n \
\ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.028838921471251458\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35071707953063885,\n\
\ \"acc_stderr\": 0.012187773370741522,\n \"acc_norm\": 0.35071707953063885,\n\
\ \"acc_norm_stderr\": 0.012187773370741522\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5477941176470589,\n \"acc_stderr\": 0.03023375855159645,\n\
\ \"acc_norm\": 0.5477941176470589,\n \"acc_norm_stderr\": 0.03023375855159645\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4444444444444444,\n \"acc_stderr\": 0.020102583895887188,\n \
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.020102583895887188\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5727272727272728,\n\
\ \"acc_stderr\": 0.047381987035454834,\n \"acc_norm\": 0.5727272727272728,\n\
\ \"acc_norm_stderr\": 0.047381987035454834\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5306122448979592,\n \"acc_stderr\": 0.031949171367580624,\n\
\ \"acc_norm\": 0.5306122448979592,\n \"acc_norm_stderr\": 0.031949171367580624\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6467661691542289,\n\
\ \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.6467661691542289,\n\
\ \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3795180722891566,\n\
\ \"acc_stderr\": 0.03777798822748018,\n \"acc_norm\": 0.3795180722891566,\n\
\ \"acc_norm_stderr\": 0.03777798822748018\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6608187134502924,\n \"acc_stderr\": 0.03631053496488905,\n\
\ \"acc_norm\": 0.6608187134502924,\n \"acc_norm_stderr\": 0.03631053496488905\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.0162380650690596,\n \"mc2\": 0.4574342728041311,\n\
\ \"mc2_stderr\": 0.01547770551899752\n }\n}\n```"
repo_url: https://huggingface.co/beaugogh/Llama2-7b-sharegpt4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|arc:challenge|25_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hellaswag|10_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T11:50:59.260675.parquet'
- config_name: results
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- results_2023-08-09T11:50:59.260675.parquet
- split: latest
path:
- results_2023-08-09T11:50:59.260675.parquet
---
# Dataset Card for Evaluation run of beaugogh/Llama2-7b-sharegpt4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/beaugogh/Llama2-7b-sharegpt4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [beaugogh/Llama2-7b-sharegpt4](https://huggingface.co/beaugogh/Llama2-7b-sharegpt4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
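The timestamped split names encode the run time in a fixed pattern (`YYYY_MM_DDTHH_MM_SS.ffffff`), so they can be mapped back to `datetime` objects when comparing runs. A minimal sketch (the helper name `parse_split_name` is illustrative, not part of any library API):

```python
from datetime import datetime

def parse_split_name(split_name: str) -> datetime:
    """Convert a run-timestamp split name such as
    '2023_08_09T11_50_59.260675' back into a datetime."""
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

run_time = parse_split_name("2023_08_09T11_50_59.260675")
print(run_time.isoformat())  # 2023-08-09T11:50:59.260675
```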
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_beaugogh__Llama2-7b-sharegpt4",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-09T11:50:59.260675](https://huggingface.co/datasets/open-llm-leaderboard/details_beaugogh__Llama2-7b-sharegpt4/blob/main/results_2023-08-09T11%3A50%3A59.260675.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.47540015600114766,
"acc_stderr": 0.0352765312816594,
"acc_norm": 0.4792802441971616,
"acc_norm_stderr": 0.03525935627017046,
"mc1": 0.31334149326805383,
"mc1_stderr": 0.0162380650690596,
"mc2": 0.4574342728041311,
"mc2_stderr": 0.01547770551899752
},
"harness|arc:challenge|25": {
"acc": 0.523037542662116,
"acc_stderr": 0.01459587320535827,
"acc_norm": 0.5588737201365188,
"acc_norm_stderr": 0.014509747749064664
},
"harness|hellaswag|10": {
"acc": 0.6152160924118701,
"acc_stderr": 0.004855498343308389,
"acc_norm": 0.8083051185022904,
"acc_norm_stderr": 0.003928298121755033
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.46710526315789475,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.46710526315789475,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5245283018867924,
"acc_stderr": 0.030735822206205608,
"acc_norm": 0.5245283018867924,
"acc_norm_stderr": 0.030735822206205608
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4652777777777778,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.4652777777777778,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4161849710982659,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.4161849710982659,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.023517294335963286,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.023517294335963286
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.04190596438871136,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.04190596438871136
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5258064516129032,
"acc_stderr": 0.028406095057653326,
"acc_norm": 0.5258064516129032,
"acc_norm_stderr": 0.028406095057653326
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.033864057460620905,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.033864057460620905
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6121212121212121,
"acc_stderr": 0.038049136539710114,
"acc_norm": 0.6121212121212121,
"acc_norm_stderr": 0.038049136539710114
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5252525252525253,
"acc_stderr": 0.03557806245087314,
"acc_norm": 0.5252525252525253,
"acc_norm_stderr": 0.03557806245087314
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6476683937823834,
"acc_stderr": 0.03447478286414357,
"acc_norm": 0.6476683937823834,
"acc_norm_stderr": 0.03447478286414357
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.44358974358974357,
"acc_stderr": 0.025189149894764205,
"acc_norm": 0.44358974358974357,
"acc_norm_stderr": 0.025189149894764205
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945284,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945284
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.46638655462184875,
"acc_stderr": 0.03240501447690071,
"acc_norm": 0.46638655462184875,
"acc_norm_stderr": 0.03240501447690071
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23178807947019867,
"acc_stderr": 0.03445406271987053,
"acc_norm": 0.23178807947019867,
"acc_norm_stderr": 0.03445406271987053
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6403669724770642,
"acc_stderr": 0.020575234660123776,
"acc_norm": 0.6403669724770642,
"acc_norm_stderr": 0.020575234660123776
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.030851992993257013,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.030851992993257013
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5931372549019608,
"acc_stderr": 0.034478911363533815,
"acc_norm": 0.5931372549019608,
"acc_norm_stderr": 0.034478911363533815
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6497890295358649,
"acc_stderr": 0.031052391937584346,
"acc_norm": 0.6497890295358649,
"acc_norm_stderr": 0.031052391937584346
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5381165919282511,
"acc_stderr": 0.033460150119732274,
"acc_norm": 0.5381165919282511,
"acc_norm_stderr": 0.033460150119732274
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968432,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968432
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760628,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49079754601226994,
"acc_stderr": 0.03927705600787443,
"acc_norm": 0.49079754601226994,
"acc_norm_stderr": 0.03927705600787443
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.5728155339805825,
"acc_stderr": 0.04897957737781168,
"acc_norm": 0.5728155339805825,
"acc_norm_stderr": 0.04897957737781168
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.688034188034188,
"acc_stderr": 0.03035152732334493,
"acc_norm": 0.688034188034188,
"acc_norm_stderr": 0.03035152732334493
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6475095785440613,
"acc_stderr": 0.01708415024408138,
"acc_norm": 0.6475095785440613,
"acc_norm_stderr": 0.01708415024408138
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.026907849856282542,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.026907849856282542
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33854748603351953,
"acc_stderr": 0.01582670009648135,
"acc_norm": 0.33854748603351953,
"acc_norm_stderr": 0.01582670009648135
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5032679738562091,
"acc_stderr": 0.028629305194003543,
"acc_norm": 0.5032679738562091,
"acc_norm_stderr": 0.028629305194003543
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5466237942122186,
"acc_stderr": 0.02827435985489424,
"acc_norm": 0.5466237942122186,
"acc_norm_stderr": 0.02827435985489424
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.558641975308642,
"acc_stderr": 0.027628737155668763,
"acc_norm": 0.558641975308642,
"acc_norm_stderr": 0.027628737155668763
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.028838921471251458,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.028838921471251458
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35071707953063885,
"acc_stderr": 0.012187773370741522,
"acc_norm": 0.35071707953063885,
"acc_norm_stderr": 0.012187773370741522
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5477941176470589,
"acc_stderr": 0.03023375855159645,
"acc_norm": 0.5477941176470589,
"acc_norm_stderr": 0.03023375855159645
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.020102583895887188,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.020102583895887188
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5727272727272728,
"acc_stderr": 0.047381987035454834,
"acc_norm": 0.5727272727272728,
"acc_norm_stderr": 0.047381987035454834
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5306122448979592,
"acc_stderr": 0.031949171367580624,
"acc_norm": 0.5306122448979592,
"acc_norm_stderr": 0.031949171367580624
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6467661691542289,
"acc_stderr": 0.03379790611796777,
"acc_norm": 0.6467661691542289,
"acc_norm_stderr": 0.03379790611796777
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3795180722891566,
"acc_stderr": 0.03777798822748018,
"acc_norm": 0.3795180722891566,
"acc_norm_stderr": 0.03777798822748018
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6608187134502924,
"acc_stderr": 0.03631053496488905,
"acc_norm": 0.6608187134502924,
"acc_norm_stderr": 0.03631053496488905
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31334149326805383,
"mc1_stderr": 0.0162380650690596,
"mc2": 0.4574342728041311,
"mc2_stderr": 0.01547770551899752
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-high_school_biology-neg-prepend-fix | 2023-08-21T07:34:53.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 6848
num_examples: 5
- name: test
num_bytes: 953604
num_examples: 310
download_size: 15677
dataset_size: 960452
---
# Dataset Card for "mmlu-high_school_biology-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Mikael110__llama-2-7b-guanaco-fp16 | 2023-09-22T21:40:23.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Mikael110/llama-2-7b-guanaco-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Mikael110/llama-2-7b-guanaco-fp16](https://huggingface.co/Mikael110/llama-2-7b-guanaco-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Mikael110__llama-2-7b-guanaco-fp16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T21:40:11.783990](https://huggingface.co/datasets/open-llm-leaderboard/details_Mikael110__llama-2-7b-guanaco-fp16/blob/main/results_2023-09-22T21-40-11.783990.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0020973154362416107,\n\
\ \"em_stderr\": 0.00046850650303684253,\n \"f1\": 0.059943372483221714,\n\
\ \"f1_stderr\": 0.0013894963297796357,\n \"acc\": 0.40754847044560916,\n\
\ \"acc_stderr\": 0.009411574300699036\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0020973154362416107,\n \"em_stderr\": 0.00046850650303684253,\n\
\ \"f1\": 0.059943372483221714,\n \"f1_stderr\": 0.0013894963297796357\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06292645943896892,\n \
\ \"acc_stderr\": 0.006688762581532721\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7521704814522494,\n \"acc_stderr\": 0.01213438601986535\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Mikael110/llama-2-7b-guanaco-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|arc:challenge|25_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T21_40_11.783990
path:
- '**/details_harness|drop|3_2023-09-22T21-40-11.783990.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T21-40-11.783990.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T21_40_11.783990
path:
- '**/details_harness|gsm8k|5_2023-09-22T21-40-11.783990.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T21-40-11.783990.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hellaswag|10_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T11:28:02.065670.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T11:28:02.065670.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T11:28:02.065670.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T21_40_11.783990
path:
- '**/details_harness|winogrande|5_2023-09-22T21-40-11.783990.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T21-40-11.783990.parquet'
- config_name: results
data_files:
- split: 2023_07_24T11_28_02.065670
path:
- results_2023-07-24T11:28:02.065670.parquet
- split: 2023_09_22T21_40_11.783990
path:
- results_2023-09-22T21-40-11.783990.parquet
- split: latest
path:
- results_2023-09-22T21-40-11.783990.parquet
---
# Dataset Card for Evaluation run of Mikael110/llama-2-7b-guanaco-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Mikael110/llama-2-7b-guanaco-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Mikael110/llama-2-7b-guanaco-fp16](https://huggingface.co/Mikael110/llama-2-7b-guanaco-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
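The timestamped split names are derived mechanically from the run timestamps that appear in the result filenames. A minimal sketch of that mapping (an assumption inferred from the split and file names in this card, not an official helper):

```python
def split_name(ts: str) -> str:
    # Convert an ISO run timestamp (as used in the results filenames,
    # e.g. "2023-09-22T21:40:11.783990") into the split-name form used
    # by this dataset, where "-" in the date and ":" in the time become "_".
    # This is an inferred convention, not a documented API.
    date, time = ts.split("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")

print(split_name("2023-09-22T21:40:11.783990"))  # → 2023_09_22T21_40_11.783990
```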
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Mikael110__llama-2-7b-guanaco-fp16",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-22T21:40:11.783990](https://huggingface.co/datasets/open-llm-leaderboard/details_Mikael110__llama-2-7b-guanaco-fp16/blob/main/results_2023-09-22T21-40-11.783990.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0020973154362416107,
"em_stderr": 0.00046850650303684253,
"f1": 0.059943372483221714,
"f1_stderr": 0.0013894963297796357,
"acc": 0.40754847044560916,
"acc_stderr": 0.009411574300699036
},
"harness|drop|3": {
"em": 0.0020973154362416107,
"em_stderr": 0.00046850650303684253,
"f1": 0.059943372483221714,
"f1_stderr": 0.0013894963297796357
},
"harness|gsm8k|5": {
"acc": 0.06292645943896892,
"acc_stderr": 0.006688762581532721
},
"harness|winogrande|5": {
"acc": 0.7521704814522494,
"acc_stderr": 0.01213438601986535
}
}
```
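The `"all"` block aggregates the per-task metrics above. As a quick sanity check (a sketch, with the values copied verbatim from the JSON above), the unweighted mean of the two accuracy-based tasks reproduces the aggregate `acc`:

```python
# Per-task accuracies copied from the 2023-09-22 run shown above.
task_acc = {
    "harness|gsm8k|5": 0.06292645943896892,
    "harness|winogrande|5": 0.7521704814522494,
}

# Unweighted mean over the accuracy-based tasks; this matches the
# "acc" field of the "all" block (0.40754847044560916).
mean_acc = sum(task_acc.values()) / len(task_acc)
print(round(mean_acc, 6))  # → 0.407548
```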
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Mikael110__llama-2-13b-guanaco-fp16 | 2023-08-27T12:39:24.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Mikael110/llama-2-13b-guanaco-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Mikael110/llama-2-13b-guanaco-fp16](https://huggingface.co/Mikael110/llama-2-13b-guanaco-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Mikael110__llama-2-13b-guanaco-fp16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-24T14:22:01.485033](https://huggingface.co/datasets/open-llm-leaderboard/details_Mikael110__llama-2-13b-guanaco-fp16/blob/main/results_2023-07-24T14%3A22%3A01.485033.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5479013613802053,\n\
\ \"acc_stderr\": 0.034515644433405324,\n \"acc_norm\": 0.5517395217798651,\n\
\ \"acc_norm_stderr\": 0.03449459574833064,\n \"mc1\": 0.3011015911872705,\n\
\ \"mc1_stderr\": 0.016058999026100612,\n \"mc2\": 0.4400425009710182,\n\
\ \"mc2_stderr\": 0.015056244434717477\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5776450511945392,\n \"acc_stderr\": 0.01443413871337998,\n\
\ \"acc_norm\": 0.6092150170648464,\n \"acc_norm_stderr\": 0.01425856388051378\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6369249153555069,\n\
\ \"acc_stderr\": 0.004799034356969391,\n \"acc_norm\": 0.8318064130651265,\n\
\ \"acc_norm_stderr\": 0.003732736770429724\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.040403110624904356,\n\
\ \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.040403110624904356\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5773584905660377,\n \"acc_stderr\": 0.03040233144576954,\n\
\ \"acc_norm\": 0.5773584905660377,\n \"acc_norm_stderr\": 0.03040233144576954\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5028901734104047,\n\
\ \"acc_stderr\": 0.038124005659748335,\n \"acc_norm\": 0.5028901734104047,\n\
\ \"acc_norm_stderr\": 0.038124005659748335\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617748,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617748\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.04372748290278007,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.04372748290278007\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.335978835978836,\n \"acc_stderr\": 0.02432631052914913,\n \"acc_norm\"\
: 0.335978835978836,\n \"acc_norm_stderr\": 0.02432631052914913\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.042163702135578345,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.042163702135578345\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.632258064516129,\n \"acc_stderr\": 0.027430866579973467,\n \"\
acc_norm\": 0.632258064516129,\n \"acc_norm_stderr\": 0.027430866579973467\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n \"\
acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0368105086916155,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0368105086916155\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6565656565656566,\n \"acc_stderr\": 0.03383201223244442,\n \"\
acc_norm\": 0.6565656565656566,\n \"acc_norm_stderr\": 0.03383201223244442\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624526,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624526\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.49230769230769234,\n \"acc_stderr\": 0.02534800603153477,\n\
\ \"acc_norm\": 0.49230769230769234,\n \"acc_norm_stderr\": 0.02534800603153477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.02763490726417854,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.02763490726417854\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.032145368597886394,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.032145368597886394\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7394495412844037,\n \"acc_stderr\": 0.01881918203485007,\n \"\
acc_norm\": 0.7394495412844037,\n \"acc_norm_stderr\": 0.01881918203485007\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.0291022543896741,\n \"acc_norm\"\
: 0.7794117647058824,\n \"acc_norm_stderr\": 0.0291022543896741\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.6962025316455697,\n \"acc_stderr\": 0.029936696387138608,\n \"\
acc_norm\": 0.6962025316455697,\n \"acc_norm_stderr\": 0.029936696387138608\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.03219079200419995,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.03219079200419995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009224,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009224\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.04139112727635463,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.04139112727635463\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.043270409325787296,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.043270409325787296\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7692307692307693,\n\
\ \"acc_stderr\": 0.02760192138141758,\n \"acc_norm\": 0.7692307692307693,\n\
\ \"acc_norm_stderr\": 0.02760192138141758\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7471264367816092,\n\
\ \"acc_stderr\": 0.015543377313719681,\n \"acc_norm\": 0.7471264367816092,\n\
\ \"acc_norm_stderr\": 0.015543377313719681\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.0261521986197268,\n\
\ \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.0261521986197268\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3486033519553073,\n\
\ \"acc_stderr\": 0.01593748465668703,\n \"acc_norm\": 0.3486033519553073,\n\
\ \"acc_norm_stderr\": 0.01593748465668703\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510468008,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510468008\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6045016077170418,\n\
\ \"acc_stderr\": 0.027770918531427838,\n \"acc_norm\": 0.6045016077170418,\n\
\ \"acc_norm_stderr\": 0.027770918531427838\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.02686949074481525,\n\
\ \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.02686949074481525\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40070921985815605,\n \"acc_stderr\": 0.029233465745573083,\n \
\ \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.029233465745573083\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41264667535853977,\n\
\ \"acc_stderr\": 0.012573836633799013,\n \"acc_norm\": 0.41264667535853977,\n\
\ \"acc_norm_stderr\": 0.012573836633799013\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.03035230339535196,\n\
\ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.03035230339535196\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5392156862745098,\n \"acc_stderr\": 0.0201655233139079,\n \
\ \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.0201655233139079\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.047245774057315726,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.047245774057315726\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6122448979591837,\n \"acc_stderr\": 0.031192230726795656,\n\
\ \"acc_norm\": 0.6122448979591837,\n \"acc_norm_stderr\": 0.031192230726795656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7114427860696517,\n\
\ \"acc_stderr\": 0.03203841040213322,\n \"acc_norm\": 0.7114427860696517,\n\
\ \"acc_norm_stderr\": 0.03203841040213322\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7426900584795322,\n \"acc_stderr\": 0.03352799844161865,\n\
\ \"acc_norm\": 0.7426900584795322,\n \"acc_norm_stderr\": 0.03352799844161865\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3011015911872705,\n\
\ \"mc1_stderr\": 0.016058999026100612,\n \"mc2\": 0.4400425009710182,\n\
\ \"mc2_stderr\": 0.015056244434717477\n }\n}\n```"
repo_url: https://huggingface.co/Mikael110/llama-2-13b-guanaco-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|arc:challenge|25_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hellaswag|10_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T14:22:01.485033.parquet'
- config_name: results
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- results_2023-07-24T14:22:01.485033.parquet
- split: latest
path:
- results_2023-07-24T14:22:01.485033.parquet
---
# Dataset Card for Evaluation run of Mikael110/llama-2-13b-guanaco-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Mikael110/llama-2-13b-guanaco-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Mikael110/llama-2-13b-guanaco-fp16](https://huggingface.co/Mikael110/llama-2-13b-guanaco-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Mikael110__llama-2-13b-guanaco-fp16",
"harness_truthfulqa_mc_0",
split="train")
```
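The per-run split names visible in the configurations above (e.g. `2023_07_24T14_22_01.485033`) appear to be derived from the run timestamp. A minimal sketch of that naming convention, inferred from the config listing rather than from any official API:

```python
# Assumed convention (inferred from the configs above, not an official API):
# the split name is the run timestamp with "-" and ":" replaced by "_".
timestamp = "2023-07-24T14:22:01.485033"
split_name = timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2023_07_24T14_22_01.485033
```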
## Latest results
These are the [latest results from run 2023-07-24T14:22:01.485033](https://huggingface.co/datasets/open-llm-leaderboard/details_Mikael110__llama-2-13b-guanaco-fp16/blob/main/results_2023-07-24T14%3A22%3A01.485033.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5479013613802053,
"acc_stderr": 0.034515644433405324,
"acc_norm": 0.5517395217798651,
"acc_norm_stderr": 0.03449459574833064,
"mc1": 0.3011015911872705,
"mc1_stderr": 0.016058999026100612,
"mc2": 0.4400425009710182,
"mc2_stderr": 0.015056244434717477
},
"harness|arc:challenge|25": {
"acc": 0.5776450511945392,
"acc_stderr": 0.01443413871337998,
"acc_norm": 0.6092150170648464,
"acc_norm_stderr": 0.01425856388051378
},
"harness|hellaswag|10": {
"acc": 0.6369249153555069,
"acc_stderr": 0.004799034356969391,
"acc_norm": 0.8318064130651265,
"acc_norm_stderr": 0.003732736770429724
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.040403110624904356,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.040403110624904356
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5773584905660377,
"acc_stderr": 0.03040233144576954,
"acc_norm": 0.5773584905660377,
"acc_norm_stderr": 0.03040233144576954
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.038124005659748335,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.038124005659748335
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617748,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617748
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278007,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278007
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.335978835978836,
"acc_stderr": 0.02432631052914913,
"acc_norm": 0.335978835978836,
"acc_norm_stderr": 0.02432631052914913
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.042163702135578345,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.042163702135578345
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.632258064516129,
"acc_stderr": 0.027430866579973467,
"acc_norm": 0.632258064516129,
"acc_norm_stderr": 0.027430866579973467
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0368105086916155,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0368105086916155
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6565656565656566,
"acc_stderr": 0.03383201223244442,
"acc_norm": 0.6565656565656566,
"acc_norm_stderr": 0.03383201223244442
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624526,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624526
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.49230769230769234,
"acc_stderr": 0.02534800603153477,
"acc_norm": 0.49230769230769234,
"acc_norm_stderr": 0.02534800603153477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.02763490726417854,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.02763490726417854
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.032145368597886394,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.032145368597886394
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7394495412844037,
"acc_stderr": 0.01881918203485007,
"acc_norm": 0.7394495412844037,
"acc_norm_stderr": 0.01881918203485007
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.0291022543896741,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.0291022543896741
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6962025316455697,
"acc_stderr": 0.029936696387138608,
"acc_norm": 0.6962025316455697,
"acc_norm_stderr": 0.029936696387138608
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.03219079200419995,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.03219079200419995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009224,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009224
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.04139112727635463,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.04139112727635463
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.043270409325787296,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.043270409325787296
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7692307692307693,
"acc_stderr": 0.02760192138141758,
"acc_norm": 0.7692307692307693,
"acc_norm_stderr": 0.02760192138141758
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7471264367816092,
"acc_stderr": 0.015543377313719681,
"acc_norm": 0.7471264367816092,
"acc_norm_stderr": 0.015543377313719681
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.0261521986197268,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.0261521986197268
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3486033519553073,
"acc_stderr": 0.01593748465668703,
"acc_norm": 0.3486033519553073,
"acc_norm_stderr": 0.01593748465668703
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027914055510468008,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027914055510468008
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6045016077170418,
"acc_stderr": 0.027770918531427838,
"acc_norm": 0.6045016077170418,
"acc_norm_stderr": 0.027770918531427838
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.02686949074481525,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.02686949074481525
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.029233465745573083,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.029233465745573083
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41264667535853977,
"acc_stderr": 0.012573836633799013,
"acc_norm": 0.41264667535853977,
"acc_norm_stderr": 0.012573836633799013
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.03035230339535196,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.03035230339535196
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.0201655233139079,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.0201655233139079
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.047245774057315726,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.047245774057315726
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6122448979591837,
"acc_stderr": 0.031192230726795656,
"acc_norm": 0.6122448979591837,
"acc_norm_stderr": 0.031192230726795656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7114427860696517,
"acc_stderr": 0.03203841040213322,
"acc_norm": 0.7114427860696517,
"acc_norm_stderr": 0.03203841040213322
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7426900584795322,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.7426900584795322,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3011015911872705,
"mc1_stderr": 0.016058999026100612,
"mc2": 0.4400425009710182,
"mc2_stderr": 0.015056244434717477
}
}
```
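The `"all"` entry above aggregates the per-task scores. A minimal sketch of how such an average could be recomputed from the per-task `acc_norm` values, using only three of the tasks shown, purely for illustration:

```python
# Illustrative only: average acc_norm over a small subset of the
# hendrycksTest tasks listed above (the real "all" entry covers every task).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.4888888888888889},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.5592105263157895},
}

mmlu = {name: v for name, v in results.items() if "hendrycksTest" in name}
average = sum(v["acc_norm"] for v in mmlu.values()) / len(mmlu)
print(round(average, 4))  # 0.4627
```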
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-high_school_chemistry-neg-prepend-fix | 2023-08-21T07:35:06.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 5851
num_examples: 5
- name: test
num_bytes: 425779
num_examples: 203
download_size: 13316
dataset_size: 431630
---
# Dataset Card for "mmlu-high_school_chemistry-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-40k-sharegpt | 2023-08-27T12:39:26.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of yihan6324/llama2-7b-instructmining-40k-sharegpt
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yihan6324/llama2-7b-instructmining-40k-sharegpt](https://huggingface.co/yihan6324/llama2-7b-instructmining-40k-sharegpt)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-40k-sharegpt\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-09T21:00:12.284244](https://huggingface.co/datasets/open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-40k-sharegpt/blob/main/results_2023-08-09T21%3A00%3A12.284244.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.506231001015833,\n\
\ \"acc_stderr\": 0.03505018845563652,\n \"acc_norm\": 0.5099522031118208,\n\
\ \"acc_norm_stderr\": 0.035035258453899244,\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.01685096106172012,\n \"mc2\": 0.5317717765572597,\n\
\ \"mc2_stderr\": 0.015775374488304787\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5170648464163823,\n \"acc_stderr\": 0.014602878388536595,\n\
\ \"acc_norm\": 0.5511945392491467,\n \"acc_norm_stderr\": 0.014534599585097664\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6041625174268074,\n\
\ \"acc_stderr\": 0.004880303863138504,\n \"acc_norm\": 0.7895837482573193,\n\
\ \"acc_norm_stderr\": 0.0040677125640782895\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4605263157894737,\n \"acc_stderr\": 0.04056242252249034,\n\
\ \"acc_norm\": 0.4605263157894737,\n \"acc_norm_stderr\": 0.04056242252249034\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5245283018867924,\n \"acc_stderr\": 0.030735822206205608,\n\
\ \"acc_norm\": 0.5245283018867924,\n \"acc_norm_stderr\": 0.030735822206205608\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4930555555555556,\n\
\ \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.4930555555555556,\n\
\ \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4682080924855491,\n\
\ \"acc_stderr\": 0.03804749744364763,\n \"acc_norm\": 0.4682080924855491,\n\
\ \"acc_norm_stderr\": 0.03804749744364763\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.03793281185307809,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.03793281185307809\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.02391998416404773,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02391998416404773\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795133,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795133\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.532258064516129,\n\
\ \"acc_stderr\": 0.028384747788813332,\n \"acc_norm\": 0.532258064516129,\n\
\ \"acc_norm_stderr\": 0.028384747788813332\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35467980295566504,\n \"acc_stderr\": 0.0336612448905145,\n\
\ \"acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.0336612448905145\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5959595959595959,\n \"acc_stderr\": 0.034961309720561294,\n \"\
acc_norm\": 0.5959595959595959,\n \"acc_norm_stderr\": 0.034961309720561294\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7512953367875648,\n \"acc_stderr\": 0.031195840877700286,\n\
\ \"acc_norm\": 0.7512953367875648,\n \"acc_norm_stderr\": 0.031195840877700286\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4564102564102564,\n \"acc_stderr\": 0.025254485424799605,\n\
\ \"acc_norm\": 0.4564102564102564,\n \"acc_norm_stderr\": 0.025254485424799605\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42436974789915966,\n \"acc_stderr\": 0.03210479051015776,\n\
\ \"acc_norm\": 0.42436974789915966,\n \"acc_norm_stderr\": 0.03210479051015776\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6990825688073394,\n \"acc_stderr\": 0.019664751366802114,\n \"\
acc_norm\": 0.6990825688073394,\n \"acc_norm_stderr\": 0.019664751366802114\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3425925925925926,\n \"acc_stderr\": 0.032365852526021574,\n \"\
acc_norm\": 0.3425925925925926,\n \"acc_norm_stderr\": 0.032365852526021574\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6764705882352942,\n \"acc_stderr\": 0.032834720561085606,\n \"\
acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.032834720561085606\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7172995780590717,\n \"acc_stderr\": 0.029312814153955934,\n \
\ \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.029312814153955934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5964125560538116,\n\
\ \"acc_stderr\": 0.03292802819330314,\n \"acc_norm\": 0.5964125560538116,\n\
\ \"acc_norm_stderr\": 0.03292802819330314\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.0426073515764456,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.0426073515764456\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6115702479338843,\n \"acc_stderr\": 0.044492703500683836,\n \"\
acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.044492703500683836\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.0471282125742677,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.0471282125742677\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5828220858895705,\n \"acc_stderr\": 0.038741028598180814,\n\
\ \"acc_norm\": 0.5828220858895705,\n \"acc_norm_stderr\": 0.038741028598180814\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.04689765937278135,\n\
\ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.04689765937278135\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7564102564102564,\n\
\ \"acc_stderr\": 0.028120966503914397,\n \"acc_norm\": 0.7564102564102564,\n\
\ \"acc_norm_stderr\": 0.028120966503914397\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6794380587484036,\n\
\ \"acc_stderr\": 0.01668889331080376,\n \"acc_norm\": 0.6794380587484036,\n\
\ \"acc_norm_stderr\": 0.01668889331080376\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5635838150289018,\n \"acc_stderr\": 0.02670054542494367,\n\
\ \"acc_norm\": 0.5635838150289018,\n \"acc_norm_stderr\": 0.02670054542494367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.264804469273743,\n\
\ \"acc_stderr\": 0.014756906483260659,\n \"acc_norm\": 0.264804469273743,\n\
\ \"acc_norm_stderr\": 0.014756906483260659\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.028607893699576066,\n\
\ \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.028607893699576066\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5916398713826366,\n\
\ \"acc_stderr\": 0.027917050748484627,\n \"acc_norm\": 0.5916398713826366,\n\
\ \"acc_norm_stderr\": 0.027917050748484627\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5216049382716049,\n \"acc_stderr\": 0.027794760105008736,\n\
\ \"acc_norm\": 0.5216049382716049,\n \"acc_norm_stderr\": 0.027794760105008736\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.02914454478159615,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.02914454478159615\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3820078226857888,\n\
\ \"acc_stderr\": 0.012409564470235565,\n \"acc_norm\": 0.3820078226857888,\n\
\ \"acc_norm_stderr\": 0.012409564470235565\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.03033257809455504,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.03033257809455504\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4918300653594771,\n \"acc_stderr\": 0.02022513434305726,\n \
\ \"acc_norm\": 0.4918300653594771,\n \"acc_norm_stderr\": 0.02022513434305726\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5918367346938775,\n \"acc_stderr\": 0.03146465712827424,\n\
\ \"acc_norm\": 0.5918367346938775,\n \"acc_norm_stderr\": 0.03146465712827424\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6915422885572139,\n\
\ \"acc_stderr\": 0.03265819588512698,\n \"acc_norm\": 0.6915422885572139,\n\
\ \"acc_norm_stderr\": 0.03265819588512698\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n\
\ \"acc_stderr\": 0.03799857454479636,\n \"acc_norm\": 0.39156626506024095,\n\
\ \"acc_norm_stderr\": 0.03799857454479636\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.03467826685703826,\n\
\ \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.03467826685703826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.01685096106172012,\n \"mc2\": 0.5317717765572597,\n\
\ \"mc2_stderr\": 0.015775374488304787\n }\n}\n```"
repo_url: https://huggingface.co/yihan6324/llama2-7b-instructmining-40k-sharegpt
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|arc:challenge|25_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hellaswag|10_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T21:00:12.284244.parquet'
- config_name: results
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- results_2023-08-09T21:00:12.284244.parquet
- split: latest
path:
- results_2023-08-09T21:00:12.284244.parquet
---
# Dataset Card for Evaluation run of yihan6324/llama2-7b-instructmining-40k-sharegpt
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yihan6324/llama2-7b-instructmining-40k-sharegpt
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yihan6324/llama2-7b-instructmining-40k-sharegpt](https://huggingface.co/yihan6324/llama2-7b-instructmining-40k-sharegpt) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-40k-sharegpt",
"harness_truthfulqa_mc_0",
split="train")
```
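The per-run split names used in the configs above are simply the run timestamp with `-` and `:` replaced by `_` (e.g. run `2023-08-09T21:00:12.284244` is stored under the split `2023_08_09T21_00_12.284244`). A minimal sketch of that mapping:

```python
# Derive the split name used in this dataset from a run timestamp.
# Grounded in the configs above: run "2023-08-09T21:00:12.284244" is stored
# under the split "2023_08_09T21_00_12.284244".
run_timestamp = "2023-08-09T21:00:12.284244"
split_name = run_timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2023_08_09T21_00_12.284244
```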
## Latest results
These are the [latest results from run 2023-08-09T21:00:12.284244](https://huggingface.co/datasets/open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-40k-sharegpt/blob/main/results_2023-08-09T21%3A00%3A12.284244.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.506231001015833,
"acc_stderr": 0.03505018845563652,
"acc_norm": 0.5099522031118208,
"acc_norm_stderr": 0.035035258453899244,
"mc1": 0.36474908200734396,
"mc1_stderr": 0.01685096106172012,
"mc2": 0.5317717765572597,
"mc2_stderr": 0.015775374488304787
},
"harness|arc:challenge|25": {
"acc": 0.5170648464163823,
"acc_stderr": 0.014602878388536595,
"acc_norm": 0.5511945392491467,
"acc_norm_stderr": 0.014534599585097664
},
"harness|hellaswag|10": {
"acc": 0.6041625174268074,
"acc_stderr": 0.004880303863138504,
"acc_norm": 0.7895837482573193,
"acc_norm_stderr": 0.0040677125640782895
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4605263157894737,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.4605263157894737,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5245283018867924,
"acc_stderr": 0.030735822206205608,
"acc_norm": 0.5245283018867924,
"acc_norm_stderr": 0.030735822206205608
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4930555555555556,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.4930555555555556,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4682080924855491,
"acc_stderr": 0.03804749744364763,
"acc_norm": 0.4682080924855491,
"acc_norm_stderr": 0.03804749744364763
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.03793281185307809,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.03793281185307809
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4723404255319149,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.4723404255319149,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02391998416404773,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02391998416404773
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795133,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795133
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.532258064516129,
"acc_stderr": 0.028384747788813332,
"acc_norm": 0.532258064516129,
"acc_norm_stderr": 0.028384747788813332
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.0336612448905145,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.0336612448905145
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5959595959595959,
"acc_stderr": 0.034961309720561294,
"acc_norm": 0.5959595959595959,
"acc_norm_stderr": 0.034961309720561294
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7512953367875648,
"acc_stderr": 0.031195840877700286,
"acc_norm": 0.7512953367875648,
"acc_norm_stderr": 0.031195840877700286
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4564102564102564,
"acc_stderr": 0.025254485424799605,
"acc_norm": 0.4564102564102564,
"acc_norm_stderr": 0.025254485424799605
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42436974789915966,
"acc_stderr": 0.03210479051015776,
"acc_norm": 0.42436974789915966,
"acc_norm_stderr": 0.03210479051015776
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6990825688073394,
"acc_stderr": 0.019664751366802114,
"acc_norm": 0.6990825688073394,
"acc_norm_stderr": 0.019664751366802114
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3425925925925926,
"acc_stderr": 0.032365852526021574,
"acc_norm": 0.3425925925925926,
"acc_norm_stderr": 0.032365852526021574
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.032834720561085606,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.032834720561085606
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.029312814153955934,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.029312814153955934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5964125560538116,
"acc_stderr": 0.03292802819330314,
"acc_norm": 0.5964125560538116,
"acc_norm_stderr": 0.03292802819330314
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.0426073515764456,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.0426073515764456
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.044492703500683836,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.044492703500683836
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.0471282125742677,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.0471282125742677
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5828220858895705,
"acc_stderr": 0.038741028598180814,
"acc_norm": 0.5828220858895705,
"acc_norm_stderr": 0.038741028598180814
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.04689765937278135,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.04689765937278135
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7564102564102564,
"acc_stderr": 0.028120966503914397,
"acc_norm": 0.7564102564102564,
"acc_norm_stderr": 0.028120966503914397
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6794380587484036,
"acc_stderr": 0.01668889331080376,
"acc_norm": 0.6794380587484036,
"acc_norm_stderr": 0.01668889331080376
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5635838150289018,
"acc_stderr": 0.02670054542494367,
"acc_norm": 0.5635838150289018,
"acc_norm_stderr": 0.02670054542494367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.264804469273743,
"acc_stderr": 0.014756906483260659,
"acc_norm": 0.264804469273743,
"acc_norm_stderr": 0.014756906483260659
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.028607893699576066,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.028607893699576066
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5916398713826366,
"acc_stderr": 0.027917050748484627,
"acc_norm": 0.5916398713826366,
"acc_norm_stderr": 0.027917050748484627
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5216049382716049,
"acc_stderr": 0.027794760105008736,
"acc_norm": 0.5216049382716049,
"acc_norm_stderr": 0.027794760105008736
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.02914454478159615,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.02914454478159615
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3820078226857888,
"acc_stderr": 0.012409564470235565,
"acc_norm": 0.3820078226857888,
"acc_norm_stderr": 0.012409564470235565
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.03033257809455504,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.03033257809455504
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4918300653594771,
"acc_stderr": 0.02022513434305726,
"acc_norm": 0.4918300653594771,
"acc_norm_stderr": 0.02022513434305726
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661896,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5918367346938775,
"acc_stderr": 0.03146465712827424,
"acc_norm": 0.5918367346938775,
"acc_norm_stderr": 0.03146465712827424
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6915422885572139,
"acc_stderr": 0.03265819588512698,
"acc_norm": 0.6915422885572139,
"acc_norm_stderr": 0.03265819588512698
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39156626506024095,
"acc_stderr": 0.03799857454479636,
"acc_norm": 0.39156626506024095,
"acc_norm_stderr": 0.03799857454479636
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36474908200734396,
"mc1_stderr": 0.01685096106172012,
"mc2": 0.5317717765572597,
"mc2_stderr": 0.015775374488304787
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-high_school_computer_science-neg-prepend-fix | 2023-08-21T07:35:18.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 10062
num_examples: 5
- name: test
num_bytes: 381800
num_examples: 100
download_size: 22073
dataset_size: 391862
---
# Dataset Card for "mmlu-high_school_computer_science-neg-prepend-fix"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_FPHam__Free_Sydney_13b_HF | 2023-08-27T12:39:27.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of FPHam/Free_Sydney_13b_HF
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [FPHam/Free_Sydney_13b_HF](https://huggingface.co/FPHam/Free_Sydney_13b_HF) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FPHam__Free_Sydney_13b_HF\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-25T10:56:58.779734](https://huggingface.co/datasets/open-llm-leaderboard/details_FPHam__Free_Sydney_13b_HF/blob/main/results_2023-07-25T10%3A56%3A58.779734.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5388035903203527,\n\
\ \"acc_stderr\": 0.03440594576726577,\n \"acc_norm\": 0.5429365846106492,\n\
\ \"acc_norm_stderr\": 0.03438630541543461,\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.01613222972815504,\n \"mc2\": 0.4563281678371965,\n\
\ \"mc2_stderr\": 0.014726956819650338\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5537542662116041,\n \"acc_stderr\": 0.01452670554853998,\n\
\ \"acc_norm\": 0.5938566552901023,\n \"acc_norm_stderr\": 0.014351656690097862\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6102370045807608,\n\
\ \"acc_stderr\": 0.004866997110388195,\n \"acc_norm\": 0.813981278629755,\n\
\ \"acc_norm_stderr\": 0.0038832652107917086\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04068942293855797,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04068942293855797\n },\n\
\ \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n \
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5584905660377358,\n \"acc_stderr\": 0.03056159042673183,\n\
\ \"acc_norm\": 0.5584905660377358,\n \"acc_norm_stderr\": 0.03056159042673183\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5763888888888888,\n\
\ \"acc_stderr\": 0.04132125019723369,\n \"acc_norm\": 0.5763888888888888,\n\
\ \"acc_norm_stderr\": 0.04132125019723369\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617748,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617748\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281336,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281336\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30423280423280424,\n \"acc_stderr\": 0.023695415009463087,\n \"\
acc_norm\": 0.30423280423280424,\n \"acc_norm_stderr\": 0.023695415009463087\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.635483870967742,\n\
\ \"acc_stderr\": 0.02737987122994325,\n \"acc_norm\": 0.635483870967742,\n\
\ \"acc_norm_stderr\": 0.02737987122994325\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n\
\ \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.03742597043806586,\n\
\ \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.03742597043806586\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836557,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836557\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.02925282329180363,\n\
\ \"acc_norm\": 0.7927461139896373,\n \"acc_norm_stderr\": 0.02925282329180363\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5205128205128206,\n \"acc_stderr\": 0.02532966316348994,\n \
\ \"acc_norm\": 0.5205128205128206,\n \"acc_norm_stderr\": 0.02532966316348994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815642,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815642\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5462184873949579,\n \"acc_stderr\": 0.03233943468182088,\n \
\ \"acc_norm\": 0.5462184873949579,\n \"acc_norm_stderr\": 0.03233943468182088\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7467889908256881,\n \"acc_stderr\": 0.018644073041375046,\n \"\
acc_norm\": 0.7467889908256881,\n \"acc_norm_stderr\": 0.018644073041375046\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7401960784313726,\n \"acc_stderr\": 0.030778554678693264,\n \"\
acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.030778554678693264\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6962025316455697,\n \"acc_stderr\": 0.029936696387138605,\n \
\ \"acc_norm\": 0.6962025316455697,\n \"acc_norm_stderr\": 0.029936696387138605\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.03219079200419995,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.03219079200419995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6942148760330579,\n \"acc_stderr\": 0.042059539338841226,\n \"\
acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.042059539338841226\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6018518518518519,\n\
\ \"acc_stderr\": 0.04732332615978815,\n \"acc_norm\": 0.6018518518518519,\n\
\ \"acc_norm_stderr\": 0.04732332615978815\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952687,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952687\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.0458212416016155,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.0458212416016155\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n\
\ \"acc_stderr\": 0.026655699653922744,\n \"acc_norm\": 0.7905982905982906,\n\
\ \"acc_norm_stderr\": 0.026655699653922744\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7330779054916986,\n\
\ \"acc_stderr\": 0.015818450894777545,\n \"acc_norm\": 0.7330779054916986,\n\
\ \"acc_norm_stderr\": 0.015818450894777545\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.615606936416185,\n \"acc_stderr\": 0.026189666966272035,\n\
\ \"acc_norm\": 0.615606936416185,\n \"acc_norm_stderr\": 0.026189666966272035\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.02782610930728369,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.02782610930728369\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.026725868809100793,\n\
\ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.026725868809100793\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.02914454478159614,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.02914454478159614\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4152542372881356,\n\
\ \"acc_stderr\": 0.012585471793400662,\n \"acc_norm\": 0.4152542372881356,\n\
\ \"acc_norm_stderr\": 0.012585471793400662\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.030365446477275675,\n\
\ \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.030365446477275675\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5261437908496732,\n \"acc_stderr\": 0.020200164564804588,\n \
\ \"acc_norm\": 0.5261437908496732,\n \"acc_norm_stderr\": 0.020200164564804588\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425464,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425464\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03136250240935893,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03136250240935893\n },\n\
\ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.031157150869355558,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.031157150869355558\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117825,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117825\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.01613222972815504,\n \"mc2\": 0.4563281678371965,\n\
\ \"mc2_stderr\": 0.014726956819650338\n }\n}\n```"
repo_url: https://huggingface.co/FPHam/Free_Sydney_13b_HF
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|arc:challenge|25_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hellaswag|10_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-25T10:56:58.779734.parquet'
- config_name: results
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- results_2023-07-25T10:56:58.779734.parquet
- split: latest
path:
- results_2023-07-25T10:56:58.779734.parquet
---
# Dataset Card for Evaluation run of FPHam/Free_Sydney_13b_HF
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/FPHam/Free_Sydney_13b_HF
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [FPHam/Free_Sydney_13b_HF](https://huggingface.co/FPHam/Free_Sydney_13b_HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FPHam__Free_Sydney_13b_HF",
"harness_truthfulqa_mc_0",
split="train")
```
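Each configuration name maps to a parquet glob of the form listed in the YAML header above. As a sketch, a small helper (hypothetical, not part of the `datasets` library) can rebuild that pattern for a given harness task, few-shot count, and run timestamp:

```python
def detail_parquet_glob(task: str, n_shot: int, timestamp: str) -> str:
    """Rebuild the glob pattern used for a task's per-sample detail files.

    `task` is the harness task name (e.g. "truthfulqa:mc"), `n_shot` the
    few-shot count, and `timestamp` the run timestamp from the split name.
    """
    return f"**/details_harness|{task}|{n_shot}_{timestamp}.parquet"


# Matches the path listed for the harness_truthfulqa_mc_0 configuration above.
print(detail_parquet_glob("truthfulqa:mc", 0, "2023-07-25T10:56:58.779734"))
# → **/details_harness|truthfulqa:mc|0_2023-07-25T10:56:58.779734.parquet
```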
## Latest results
These are the [latest results from run 2023-07-25T10:56:58.779734](https://huggingface.co/datasets/open-llm-leaderboard/details_FPHam__Free_Sydney_13b_HF/blob/main/results_2023-07-25T10%3A56%3A58.779734.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5388035903203527,
"acc_stderr": 0.03440594576726577,
"acc_norm": 0.5429365846106492,
"acc_norm_stderr": 0.03438630541543461,
"mc1": 0.30599755201958384,
"mc1_stderr": 0.01613222972815504,
"mc2": 0.4563281678371965,
"mc2_stderr": 0.014726956819650338
},
"harness|arc:challenge|25": {
"acc": 0.5537542662116041,
"acc_stderr": 0.01452670554853998,
"acc_norm": 0.5938566552901023,
"acc_norm_stderr": 0.014351656690097862
},
"harness|hellaswag|10": {
"acc": 0.6102370045807608,
"acc_stderr": 0.004866997110388195,
"acc_norm": 0.813981278629755,
"acc_norm_stderr": 0.0038832652107917086
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5,
"acc_stderr": 0.04068942293855797,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04068942293855797
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5584905660377358,
"acc_stderr": 0.03056159042673183,
"acc_norm": 0.5584905660377358,
"acc_norm_stderr": 0.03056159042673183
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5763888888888888,
"acc_stderr": 0.04132125019723369,
"acc_norm": 0.5763888888888888,
"acc_norm_stderr": 0.04132125019723369
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617748,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617748
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281336,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281336
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30423280423280424,
"acc_stderr": 0.023695415009463087,
"acc_norm": 0.30423280423280424,
"acc_norm_stderr": 0.023695415009463087
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.635483870967742,
"acc_stderr": 0.02737987122994325,
"acc_norm": 0.635483870967742,
"acc_norm_stderr": 0.02737987122994325
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.03742597043806586,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.03742597043806586
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836557,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836557
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7927461139896373,
"acc_stderr": 0.02925282329180363,
"acc_norm": 0.7927461139896373,
"acc_norm_stderr": 0.02925282329180363
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5205128205128206,
"acc_stderr": 0.02532966316348994,
"acc_norm": 0.5205128205128206,
"acc_norm_stderr": 0.02532966316348994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815642,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815642
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5462184873949579,
"acc_stderr": 0.03233943468182088,
"acc_norm": 0.5462184873949579,
"acc_norm_stderr": 0.03233943468182088
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7467889908256881,
"acc_stderr": 0.018644073041375046,
"acc_norm": 0.7467889908256881,
"acc_norm_stderr": 0.018644073041375046
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.030778554678693264,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.030778554678693264
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6962025316455697,
"acc_stderr": 0.029936696387138605,
"acc_norm": 0.6962025316455697,
"acc_norm_stderr": 0.029936696387138605
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.03219079200419995,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.03219079200419995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.042059539338841226,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.042059539338841226
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.04732332615978815,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.04732332615978815
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6380368098159509,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.6380368098159509,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952687,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952687
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.0458212416016155,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.0458212416016155
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7905982905982906,
"acc_stderr": 0.026655699653922744,
"acc_norm": 0.7905982905982906,
"acc_norm_stderr": 0.026655699653922744
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7330779054916986,
"acc_stderr": 0.015818450894777545,
"acc_norm": 0.7330779054916986,
"acc_norm_stderr": 0.015818450894777545
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.615606936416185,
"acc_stderr": 0.026189666966272035,
"acc_norm": 0.615606936416185,
"acc_norm_stderr": 0.026189666966272035
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.02782610930728369,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.02782610930728369
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.026725868809100793,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.026725868809100793
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.02914454478159614,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.02914454478159614
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4152542372881356,
"acc_stderr": 0.012585471793400662,
"acc_norm": 0.4152542372881356,
"acc_norm_stderr": 0.012585471793400662
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4889705882352941,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.4889705882352941,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5261437908496732,
"acc_stderr": 0.020200164564804588,
"acc_norm": 0.5261437908496732,
"acc_norm_stderr": 0.020200164564804588
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425464,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425464
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6,
"acc_stderr": 0.03136250240935893,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03136250240935893
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.031157150869355558,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.031157150869355558
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117825,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117825
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30599755201958384,
"mc1_stderr": 0.01613222972815504,
"mc2": 0.4563281678371965,
"mc2_stderr": 0.014726956819650338
}
}
```
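The top-level `"all"` block aggregates the per-task entries. As a sketch of how such an aggregate can be recomputed client-side (using a two-task excerpt of the values above; the leaderboard's exact averaging may differ):

```python
# Excerpt of per-task results copied from the JSON above.
results = {
    "harness|hendrycksTest-virology|5": {"acc": 0.4759036144578313},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.7660818713450293},
}

# Unweighted mean accuracy over the selected tasks.
mean_acc = sum(v["acc"] for v in results.values()) / len(results)
print(f"{mean_acc:.4f}")
```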
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Azure99__blossom-v2-3b | 2023-09-16T18:37:00.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Azure99/blossom-v2-3b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Azure99/blossom-v2-3b](https://huggingface.co/Azure99/blossom-v2-3b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Azure99__blossom-v2-3b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-16T18:36:49.609194](https://huggingface.co/datasets/open-llm-leaderboard/details_Azure99__blossom-v2-3b/blob/main/results_2023-09-16T18-36-49.609194.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.034395973154362415,\n\
\ \"em_stderr\": 0.0018663495487686885,\n \"f1\": 0.11167470637583889,\n\
\ \"f1_stderr\": 0.0023912000923338094,\n \"acc\": 0.2966551039299941,\n\
\ \"acc_stderr\": 0.007917209289296998\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.034395973154362415,\n \"em_stderr\": 0.0018663495487686885,\n\
\ \"f1\": 0.11167470637583889,\n \"f1_stderr\": 0.0023912000923338094\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.00530705079605762,\n \
\ \"acc_stderr\": 0.002001305720948054\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5880031570639306,\n \"acc_stderr\": 0.013833112857645942\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Azure99/blossom-v2-3b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|arc:challenge|25_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_16T18_36_49.609194
path:
- '**/details_harness|drop|3_2023-09-16T18-36-49.609194.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-16T18-36-49.609194.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_16T18_36_49.609194
path:
- '**/details_harness|gsm8k|5_2023-09-16T18-36-49.609194.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-16T18-36-49.609194.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hellaswag|10_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_16T18_36_49.609194
path:
- '**/details_harness|winogrande|5_2023-09-16T18-36-49.609194.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-16T18-36-49.609194.parquet'
- config_name: results
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- results_2023-08-09T15:22:00.974376.parquet
- split: 2023_09_16T18_36_49.609194
path:
- results_2023-09-16T18-36-49.609194.parquet
- split: latest
path:
- results_2023-09-16T18-36-49.609194.parquet
---
# Dataset Card for Evaluation run of Azure99/blossom-v2-3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Azure99/blossom-v2-3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Azure99/blossom-v2-3b](https://huggingface.co/Azure99/blossom-v2-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
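The timestamp-to-split-name mapping can be sketched in a couple of lines. Note this convention is inferred from the file listing in this card (dashes and colons in the run timestamp become underscores), not from an official API:

```python
def timestamp_to_split_name(ts: str) -> str:
    """Map a run timestamp to the split name used in this dataset.

    Inferred convention: '-' and ':' become '_'; the fractional-second
    dot is kept. e.g. "2023-09-16T18:36:49.609194"
    -> "2023_09_16T18_36_49.609194"
    """
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split_name("2023-09-16T18:36:49.609194"))
```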
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Azure99__blossom-v2-3b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-16T18:36:49.609194](https://huggingface.co/datasets/open-llm-leaderboard/details_Azure99__blossom-v2-3b/blob/main/results_2023-09-16T18-36-49.609194.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its timestamped split, and the most recent in the "latest" split of each configuration):
```python
{
"all": {
"em": 0.034395973154362415,
"em_stderr": 0.0018663495487686885,
"f1": 0.11167470637583889,
"f1_stderr": 0.0023912000923338094,
"acc": 0.2966551039299941,
"acc_stderr": 0.007917209289296998
},
"harness|drop|3": {
"em": 0.034395973154362415,
"em_stderr": 0.0018663495487686885,
"f1": 0.11167470637583889,
"f1_stderr": 0.0023912000923338094
},
"harness|gsm8k|5": {
"acc": 0.00530705079605762,
"acc_stderr": 0.002001305720948054
},
"harness|winogrande|5": {
"acc": 0.5880031570639306,
"acc_stderr": 0.013833112857645942
}
}
```
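As a sanity check on the JSON above, the "acc" reported under "all" appears to be the unweighted mean of the per-task accuracies. A minimal sketch using the values copied from the results block (the averaging rule itself is an assumption inferred from these numbers):

```python
# Per-task accuracies copied from the latest-results JSON above.
per_task_acc = {
    "harness|winogrande|5": 0.5880031570639306,
    "harness|gsm8k|5": 0.00530705079605762,
}

# Assumption: "all" reports the unweighted mean over tasks with an "acc" key.
mean_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(mean_acc)  # matches "acc": 0.2966551039299941 under "all"
```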
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]